Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2020/07/30 12:48:06 UTC

Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #810

See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/810/display/redirect>

Changes:


------------------------------------------
[...truncated 295.41 KB...]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Jul 30, 2020 12:45:23 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Jul 30, 2020 12:45:23 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Jul 30, 2020 12:45:24 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Jul 30, 2020 12:45:24 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Jul 30, 2020 12:45:24 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Jul 30, 2020 12:45:24 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Jul 30, 2020 12:45:24 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
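
The IllegalStateException above is Beam's standard coder-inference failure for a PCollection of Row: Row has no default Coder, so the schema has to be supplied explicitly, which is what the message's PCollection.setRowSchema hint refers to. A minimal, self-contained sketch of that pattern follows; the two-field schema and the pass-through conversion DoFn are illustrative placeholders, not the code used by BigQueryIOPushDownIT.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Illustrative two-field schema; the real HACKER_NEWS schema has more fields.
        Schema schema = Schema.builder()
            .addStringField("author")
            .addInt64Field("score")
            .build();

        PCollection<Row> rows = p
            .apply(Create.of("alice,3", "bob,7"))
            .apply("ToRow", ParDo.of(new DoFn<String, Row>() {
              @ProcessElement
              public void process(@Element String line, OutputReceiver<Row> out) {
                String[] parts = line.split(",");
                out.output(Row.withSchema(schema)
                    .addValues(parts[0], Long.parseLong(parts[1]))
                    .build());
              }
            }))
            // Without this call, coder inference fails exactly as in the log above,
            // because Row has no default Coder.
            .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }

The same Schema object is reused both to build the rows and to declare the output coder, which is the piece the failing pipeline above is missing.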

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Jul 30, 2020 12:45:24 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Jul 30, 2020 12:45:24 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Jul 30, 2020 12:45:24 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Jul 30, 2020 12:45:24 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Jul 30, 2020 12:45:24 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Jul 30, 2020 12:45:24 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Jul 30, 2020 12:45:24 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Jul 30, 2020 12:45:24 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
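
In the BEAMPlan above the source is a BeamPushDownIOSourceRel: the projected columns (usedFields) and the supported filter are handed to the BigQuery Storage read itself instead of being evaluated in a downstream Calc, which is why only four fields leave BigQuery. Expressed directly against BigQueryIO rather than through Beam SQL, the pushed-down read is roughly equivalent to the following sketch; the table spec is a placeholder for the Hacker News table the test points at, and only the field list and filter text are taken from the log.

    import java.util.Arrays;
    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.values.PCollection;

    public class PushDownReadSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Read only the projected columns and let BigQuery evaluate the filter,
        // which is what the pushed-down plan above does through Beam SQL.
        PCollection<TableRow> rows = p.apply(
            "ReadHackerNews",
            BigQueryIO.readTableRows()
                .from("bigquery-public-data:hacker_news.full")   // placeholder table spec
                .withMethod(BigQueryIO.TypedRead.Method.DIRECT_READ)
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }
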
    Jul 30, 2020 12:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Jul 30, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Jul 30, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-523WX4eiJRa23FO8T1A3yhGB742T2XkgPzJB5lXjVlU.jar
    Jul 30, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.24.0-SNAPSHOT-K4GyeDFEFVc3dVDy2YoE2N9fr9IdkHj9Uf4FPEHqkyo.jar
    Jul 30, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT-gdes9MiXk0sOCRj1ACMoFkvLz_3SIVC6Iuknob93glE.jar
    Jul 30, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-kU5ybE7UBvfyyJ9q77gPKt-wnXwhT4VH42kR07v9n5o.jar
    Jul 30, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-523WX4eiJRa23FO8T1A3yhGB742T2XkgPzJB5lXjVlU.jar
    Jul 30, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-UU8H0MwtN77lITOyiSju3dah_FvVX6Q6-dRlbi8rfco.jar
    Jul 30, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT-LWnvAPOW11md47mFZmyhr5q2R8_M_ci49szA_OLMfyY.jar
    Jul 30, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-n-bmS3XLlTBT7pqUibGq1_RGYG7KO1sAgEikZxz0bdE.jar
    Jul 30, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-tests-p8RGx9Uuhno-hmqulXYrLREhIE_IXoKrG-Qga3f4f68.jar
    Jul 30, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-X7ZAjKibT9J5VnBpfLSzO2jW7ex_pbJcaEWdl8hMJ4s.jar
    Jul 30, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests-PM-zN1lnXD8-o9xQ3cLNwZJ4kSMAf-L5MFLvvD60I8w.jar
    Jul 30, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.24.0-SNAPSHOT--FIDJK7qmFRsmNP7ZqpXpP8nHIQ6wng9f5HPAxmXyn8.jar
    Jul 30, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-KZc3Ao-VCOnGcHYlmJgeQJU-O24ahg4AogwPqURLVb0.jar
    Jul 30, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-RYPvHwBzbOAt2z55flxRnBz76H9wbdT9MKKI1hHxFio.jar
    Jul 30, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests-efMFVVkU-VQy17BitGKxzmZyAPBu4YwtuQFt_DXLQqo.jar
    Jul 30, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests-Q3YsVM_XSBNowuyeQIHzhVtxH-Wl_Dd7mC7ItFVs8WA.jar
    Jul 30, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT-pDFmRErqwjNewl-HbWp_gnLuWyk7LOvAT9RR2u2zsCA.jar
    Jul 30, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.24.0-SNAPSHOT-vU6nF0f3R7SI5oNy3QgyFBi9NX-nT5-DMITbEDy-NnM.jar
    Jul 30, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4352175275096372723.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-WWhduTO7KMrfeBCmRVBuld_RoSYAfNIrJB9-qdg7rUw.jar
    Jul 30, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-JwLgjMIG35vlzC8F4prEW68scohcf6UlJVghIOo-p4I.jar
    Jul 30, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests-HeOo2SHPG8LFcgMSQCbuwM-qmtgiSoT9atFzRUxU7Hs.jar
    Jul 30, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded-dehDkuKk0XHowFpoNQSVTJYJ3r-o4FH_ZvIzPOMRGxk.jar
    Jul 30, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests-7Rn0XebY9GthagpKmx3MQsNVUPOZKzBViJUGpf4ebL8.jar
    Jul 30, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests-Y7B_GMM8TIW4YZp8DwehlON5NQyS94KINLchRZVLzzg.jar
    Jul 30, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-rnWgf86bK2ZQMKZM50gricDrupu4inDqa5K64VH2GAQ.jar
    Jul 30, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-gOQuK1plmPzxqzzmYsXdHlHy4uQbDtifkd0XGx6iuKg.jar
    Jul 30, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests-iNjcqqFSwwz6nMrXmEJNwHRkspBCcz4M7ogymq-3qQw.jar
    Jul 30, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-wVMR5mEH963ZOaZP2ekO_rPd6ENILDvD2e8Ypojm1EA.jar
    Jul 30, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.24.0-SNAPSHOT-6J6JVxBL90C1jqXaOND4Uj7TnKlPqKZ5hoJjaBS4tNc.jar
    Jul 30, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.24.0-SNAPSHOT-M5SOVFSjlJr0ZJtyBK3lC6EDbd09l_zHTRzewbtZu48.jar
    Jul 30, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.24.0-SNAPSHOT-n-SaJSLCFmCi7wrnOgmC-RJWPGLfRTPcuQSXKuyiE4w.jar
    Jul 30, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 1 seconds
    Jul 30, 2020 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Jul 30, 2020 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Jul 30, 2020 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Jul 30, 2020 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Jul 30, 2020 12:45:29 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Jul 30, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93803 bytes, hash 4b267e96babfadc752555150e1f3a3a4a3d53a8a4d6a93e7e030a90307cb9fa0> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-SyZ-lrq_rcdSVVFQ4fOjpKPVOopNapPn4DCpAwfLn6A.pb
    Jul 30, 2020 12:45:29 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.24.0-SNAPSHOT
    Jul 30, 2020 12:45:30 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-30_05_45_29-17184679433290169806?project=apache-beam-testing
    Jul 30, 2020 12:45:30 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-07-30_05_45_29-17184679433290169806
    Jul 30, 2020 12:45:30 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-07-30_05_45_29-17184679433290169806
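
The gcloud command above is the shell route the DataflowRunner prints for cancelling a submitted job. From Java the same request can be issued on the PipelineResult handle; a small sketch, assuming a pipeline with at least one step so the submitted job is accepted:

    import java.io.IOException;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.PipelineResult;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.transforms.Create;

    public class CancelSketch {
      public static void main(String[] args) throws IOException {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
        p.apply(Create.of("placeholder"));   // some work, so the job has a step
        PipelineResult result = p.run();     // with --runner=DataflowRunner this submits the job
        result.cancel();                     // programmatic equivalent of the gcloud command above
      }
    }
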
    Jul 30, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-07-30T12:45:29.556Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Jul 30, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-30T12:45:37.999Z: Worker configuration: n1-standard-1 in us-central1-a.
    Jul 30, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-30T12:45:39.085Z: Expanding CoGroupByKey operations into optimizable parts.
    Jul 30, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-30T12:45:39.115Z: Expanding GroupByKey operations into optimizable parts.
    Jul 30, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-30T12:45:39.144Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Jul 30, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-30T12:45:39.208Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Jul 30, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-30T12:45:39.232Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Jul 30, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-30T12:45:39.254Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Jul 30, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-30T12:45:39.282Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Jul 30, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-30T12:45:39.569Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Jul 30, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-30T12:45:39.632Z: Starting 5 workers in us-central1-a...
    Jul 30, 2020 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-30T12:46:06.474Z: Autoscaling: Raised the number of workers to 2 based on the rate of progress in the currently running stage(s).
    Jul 30, 2020 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-30T12:46:06.508Z: Resized worker pool to 2, though goal was 5.  This could be a quota issue.
    Jul 30, 2020 12:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-07-30T12:46:10.670Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Jul 30, 2020 12:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-30T12:46:11.909Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Jul 30, 2020 12:46:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-30T12:46:32.878Z: Workers have started successfully.
    Jul 30, 2020 12:46:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-30T12:46:32.927Z: Workers have started successfully.
    Jul 30, 2020 12:47:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-30T12:47:07.380Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Jul 30, 2020 12:47:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-30T12:47:07.588Z: Cleaning up.
    Jul 30, 2020 12:47:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-30T12:47:07.682Z: Stopping worker pool...
    Jul 30, 2020 12:47:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-30T12:47:58.354Z: Autoscaling: Resized worker pool from 5 to 0.
    Jul 30, 2020 12:47:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-30T12:47:58.409Z: Worker pool stopped.
    Jul 30, 2020 12:48:04 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-07-30_05_45_29-17184679433290169806 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 936b56ef-d10b-48b6-82bd-3099a8e0e697 and timestamp: 2020-07-30T12:48:04.360000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    15.212
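
The fields_read and read_time values above are produced by the RowMonitor and TimeMonitor ParDos that appear as steps s2 and s4 earlier in this log; they are published as ordinary Beam metrics. The actual monitor DoFns are not shown in this output, but a pass-through monitor built on the Metrics API would look roughly like the sketch below; the namespace and metric names are illustrative, not the ones the test uses.

    import org.apache.beam.sdk.metrics.Counter;
    import org.apache.beam.sdk.metrics.Distribution;
    import org.apache.beam.sdk.metrics.Metrics;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.values.Row;

    // Pass-through DoFn that records how many fields flow through it and when.
    public class RowMonitorSketch extends DoFn<Row, Row> {
      private final Counter fieldsRead = Metrics.counter("sql_perf", "fields_read");
      private final Distribution readTime = Metrics.distribution("sql_perf", "read_time_millis");

      @ProcessElement
      public void processElement(@Element Row row, OutputReceiver<Row> out) {
        fieldsRead.inc(row.getFieldCount());          // one increment per field read
        readTime.update(System.currentTimeMillis());  // min/max of this distribution bound the read window
        out.output(row);                              // unchanged element, monitoring only
      }
    }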

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Jul 30, 2020 12:48:04 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 8,5,main]) completed. Took 2 mins 48.987 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 48s
106 actionable tasks: 73 executed, 33 from cache

Publishing build scan...
https://gradle.com/s/b362vra6vjrpa

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1060

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1060/display/redirect?page=changes>

Changes:

[justinwhite321] Upgrade conscrypt to latest version

[Kyle Weaver] Revert "Fix up-to-date checking mechanism on Tox Task."

[Kenneth Knowles] Set default Python version for virtualenv to 3.6, the minimum supported

[noreply] [BEAM-9681] Add textio.Read lesson to Go SDK katas (#12941)

[noreply] [BEAM-10200] Respect profile_memory option and add memory profiler to…


------------------------------------------
[...truncated 279.52 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 01, 2020 12:45:38 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 01, 2020 12:45:38 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 01, 2020 12:45:38 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 01, 2020 12:45:38 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 01, 2020 12:45:38 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 01, 2020 12:45:38 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 01, 2020 12:45:38 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
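
This is the same coder failure seen in build #810: the readUsingDefaultMethod pipeline ends up with a PCollection of Row that has neither a schema nor an explicit Coder. Besides PCollection.setRowSchema, the error message also points at .setCoder(); for Row elements the two amount to the same thing, since RowCoder is the schema-backed coder. A minimal sketch of the setCoder variant, with an illustrative two-field schema:

    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowCoderSketch {
      // 'rows' stands for a PCollection<Row> like the RowMonitor output in the trace above.
      static PCollection<Row> withRowCoder(PCollection<Row> rows) {
        // Illustrative two-field schema; the real HACKER_NEWS schema has more fields.
        Schema schema = Schema.builder()
            .addStringField("author")
            .addInt64Field("score")
            .build();
        // For Row elements this is equivalent to rows.setRowSchema(schema).
        return rows.setCoder(RowCoder.of(schema));
      }
    }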

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 01, 2020 12:45:39 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 01, 2020 12:45:39 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 01, 2020 12:45:39 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 01, 2020 12:45:39 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 01, 2020 12:45:39 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 01, 2020 12:45:39 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 01, 2020 12:45:39 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 01, 2020 12:45:39 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 01, 2020 12:45:40 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 01, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 01, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-hxHL7B003DAn5hhJvOBsS2VRNDzkYSjI7Xmu3LvhB2E.jar
    Oct 01, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-4oonz_kumL_cJ2qb-Xa5rCMofj87Bd9XHQ_sLIEVDYA.jar
    Oct 01, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-HRlU4oEC7P5i3kwHBhwsON0vHih8AfY5OTPgdKydmzo.jar
    Oct 01, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-wsWHHi_NiL54IMhLW3mjXVBkv7dcUy-7DZznnswqFe0.jar
    Oct 01, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-evlG_MefN3WHSasnXF60OAJ3MJp81mNEHZHqIIoA9eA.jar
    Oct 01, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-mHXB1MBUCp1_kePvf7SECsaHKQUrj2_HtvV3lt-GpVM.jar
    Oct 01, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-IwXKT9pjhO82uCHcfbC_yOgfgWWi0YRsE4FCjk2SGHY.jar
    Oct 01, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-M46YeHv5Ga8MG3SfCMW4Xb6VNCiP6LmFEK7cR90NH4E.jar
    Oct 01, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-hxHL7B003DAn5hhJvOBsS2VRNDzkYSjI7Xmu3LvhB2E.jar
    Oct 01, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-R2_RDD19tv3qxsR3KAAly5o3XvKoUCbnfqbR7P-zLbc.jar
    Oct 01, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT--wwYNFfXF8pAzeumE40f89Xjo5FbUQX2LsyI1O18W2c.jar
    Oct 01, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-n1IoevSdAwTntbVPmfJBJeTslFW28YV9CL9Wxv006Kc.jar
    Oct 01, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-XRjvBUsRfb0Ne9bNyfbHxmPnRm32-oU_TWaT9EC_svE.jar
    Oct 01, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests--wY6DAVJz1b15CIPJvqFIda0-EYrfxTcSidKY_dfg_8.jar
    Oct 01, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-BEFYmtIAP5SU0jr9Jt1WrW3LHmHCxHEdyGaQ1bFt6qA.jar
    Oct 01, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-O5qwv2o2CnhR6eqm4_7cxG_9KpxZmQ2a3C6rVPK3Vn4.jar
    Oct 01, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-YW2bGgsDUhAEO32jdcGZvOiNXw2gcHZaS-9wzq0ybJU.jar
    Oct 01, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-8bFjEVsfoPaDyhrOzvMwjA2yJfrmoRTEFVwq64LFeYM.jar
    Oct 01, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-GwpCV-xnGJ5W1dfH94AjEXIrAllmObua_VeZl7A-WB0.jar
    Oct 01, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-z2pcJyPeCNWrJN1f0PAzKKZcVAy_u2KK1nxFmP2DPsM.jar
    Oct 01, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-qL04d_-4Pr9bJoEyURzLXe0cmuP8ivAv2RUeAX4qIfM.jar
    Oct 01, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test9192971367191216936.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-VVtEGka3SBuf2A3I1mUMg81zeW4h6kYch31uNgUHptg.jar
    Oct 01, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-TyJTS71efSxbm-lC5fmX2C-5zQiUw26MUsMW7cw_A38.jar
    Oct 01, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT--17jbi2-gPjLBeLWGZoC87BRtzK2Xm1nxbLabIn-eqA.jar
    Oct 01, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-8GRdz3sJamJDVdRPu6rsLQmQeI532faARfqw98rRJcQ.jar
    Oct 01, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-U-h8stcgJfKbkbpAbtS4eUb1VoZXE9KUS11Cc6LxEOc.jar
    Oct 01, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-a87jqTWMWs7fw48XH0Y-MPHNt3pw5ao6PZ9V2oEt2yU.jar
    Oct 01, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-yCmiCvRsdcNZ4ki5cB5sQ8FEjDR7pLwvpZDbWJvLbCo.jar
    Oct 01, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-K1s7gOuVLHCcopzZ7p--7sJpV0GRoQL13A0mylCuau0.jar
    Oct 01, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-pY4Zf_ZKxzB8pncsQ2dE7g98_k2xf5nLdx39QYl7xvc.jar
    Oct 01, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-_oJ3NsmlIUYC7Ap72PBAJNxkXtkS4Epf8PkrC8tE6Bo.jar
    Oct 01, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 190 files cached, 30 files newly uploaded in 1 seconds
    Oct 01, 2020 12:45:43 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 01, 2020 12:45:43 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 01, 2020 12:45:43 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 01, 2020 12:45:43 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 01, 2020 12:45:43 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 01, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95918 bytes, hash a1ec324a860d4df57cd630b221f533142a43c2e7254b05b6338e308d2c797f86> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-oewySoYNTfV81jCyIfUzFCpDwuclSwW2M44wjSx5f4Y.pb
    Oct 01, 2020 12:45:43 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 01, 2020 12:45:45 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-30_17_45_44-16062935913013719394?project=apache-beam-testing
    Oct 01, 2020 12:45:45 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-30_17_45_44-16062935913013719394
    Oct 01, 2020 12:45:45 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-30_17_45_44-16062935913013719394
    Oct 01, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-01T00:45:44.046Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 01, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T00:45:50.625Z: Worker configuration: n1-standard-1 in us-central1-c.
    Oct 01, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T00:45:51.272Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 01, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T00:45:51.314Z: Expanding GroupByKey operations into optimizable parts.
    Oct 01, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T00:45:51.341Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 01, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T00:45:51.410Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 01, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T00:45:51.429Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 01, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T00:45:51.451Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 01, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T00:45:51.483Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 01, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T00:45:51.940Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 01, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T00:45:52.001Z: Starting 5 workers in us-central1-c...
    Oct 01, 2020 12:46:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T00:46:15.020Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 01, 2020 12:46:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T00:46:16.921Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 01, 2020 12:46:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T00:46:37.038Z: Workers have started successfully.
    Oct 01, 2020 12:46:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T00:46:37.062Z: Workers have started successfully.
    Oct 01, 2020 12:47:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T00:47:05.727Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 01, 2020 12:47:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T00:47:05.859Z: Cleaning up.
    Oct 01, 2020 12:47:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T00:47:05.947Z: Stopping worker pool...
    Oct 01, 2020 12:48:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T00:48:00.238Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 01, 2020 12:48:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T00:48:00.283Z: Worker pool stopped.
    Oct 01, 2020 12:48:07 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-30_17_45_44-16062935913013719394 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): b0f618f6-91c7-49dd-9815-4c661eaa92ed and timestamp: 2020-10-01T00:48:07.370000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     9.102
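
The read_time and fields_read values above are produced by the monitoring ParDos (RowMonitor/TimeMonitor) that the fused pipeline graph shows downstream of the BigQuery read. Purely as a hypothetical sketch (the metric namespace, metric name, and Row passthrough are assumptions, not the IT's actual test-utils code), a time-monitoring DoFn can be as small as updating a Beam Distribution per element:

    import org.apache.beam.sdk.metrics.Distribution;
    import org.apache.beam.sdk.metrics.Metrics;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.values.Row;

    // Hypothetical stand-in for ParDo(TimeMonitor): records when each element was seen.
    // A test harness can derive a read duration from the min/max of this distribution.
    public class TimeMonitorSketch extends DoFn<Row, Row> {
      private final Distribution timestamps =
          Metrics.distribution("sql_bqio_read", "processing_time_millis"); // assumed namespace/name

      @ProcessElement
      public void processElement(ProcessContext c) {
        timestamps.update(System.currentTimeMillis());
        c.output(c.element()); // pass rows through unchanged
      }
    }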

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 01, 2020 12:48:07 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 3 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.024 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.031 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 2 mins 37.095 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 51s
107 actionable tasks: 73 executed, 34 from cache

Publishing build scan...
https://gradle.com/s/rq4kvhrcsquqy

Stopped 2 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1059

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1059/display/redirect?page=changes>

Changes:

[zyichi] Add nexmark python query 10 to choices

[Kyle Weaver] [BEAM-10671] Add environment configuration fields as a repeated pipeline

[Pablo Estrada] Passing project properly in BQSource

[noreply] [BEAM-10986] Rollback to shadow 4.0.3 (#12969)


------------------------------------------
[...truncated 280.51 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 30, 2020 6:45:45 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 30, 2020 6:45:45 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 30, 2020 6:45:45 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 30, 2020 6:45:45 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 30, 2020 6:45:45 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 30, 2020 6:45:45 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 30, 2020 6:45:45 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
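
The readUsingDefaultMethod failure above is the schema problem the message describes: the RowMonitor output is a PCollection<Row> with no schema attached, so Beam cannot infer a RowCoder for it. Below is a minimal, self-contained sketch of the two fixes the error lists (setCoder with a RowCoder, or setRowSchema), with illustrative field names and types rather than the IT's actual code:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.MapElements;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;
    import org.apache.beam.sdk.values.TypeDescriptor;

    public class RowSchemaFixSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Assumed schema mirroring the query's projection (author, type, title, score).
        Schema schema = Schema.builder()
            .addStringField("author")
            .addStringField("type")
            .addStringField("title")
            .addInt32Field("score")
            .build();

        PCollection<Row> rows = p.apply(
            Create.of(Row.withSchema(schema).addValues("someone", "story", "hello", 3).build())
                .withCoder(RowCoder.of(schema))); // fix #1: set the coder explicitly

        // A transform that emits Row without declaring a schema reproduces the
        // "Unable to return a default Coder" error; attaching the schema to its
        // output before the pipeline is finalized is fix #2.
        rows.apply(MapElements.into(TypeDescriptor.of(Row.class)).via((Row r) -> r))
            .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }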

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 30, 2020 6:45:45 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 30, 2020 6:45:45 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 30, 2020 6:45:46 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 30, 2020 6:45:46 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 30, 2020 6:45:46 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 30, 2020 6:45:46 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 30, 2020 6:45:46 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 30, 2020 6:45:46 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
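
The push-down reported here (usedFields plus a supported BigQueryFilter) is the point of the DIRECT_READ variant: only the four projected columns and the filtered rows are requested from the BigQuery Storage API. Roughly the same read, written by hand against plain BigQueryIO, would look like the following sketch (the table reference is a placeholder, and this is not the IT's code path, which goes through the SQL BigQuery table provider):

    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        p.apply("Read BQ rows with push-down",
            BigQueryIO.readTableRows()
                .from("my-project:my_dataset.hacker_news") // placeholder table reference
                .withMethod(BigQueryIO.TypedRead.Method.DIRECT_READ)
                // Only the projected columns are requested from the Storage API ...
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                // ... and the WHERE clause travels as a row restriction.
                .withRowRestriction("(`type` = 'story' OR `type` = 'job') AND `score` > 2"));

        p.run().waitUntilFinish();
      }
    }
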
    Sep 30, 2020 6:45:47 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 30, 2020 6:45:50 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 30, 2020 6:45:50 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-zGr6hp_tOPjjTrGxPLWbUKxzR6jOJPm_s3yZ7SIIqEE.jar
    Sep 30, 2020 6:45:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT--6AOQwr_lhjcphHsAODaYhZx-kbR8xp48sqLixyw1lA.jar
    Sep 30, 2020 6:45:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-aDjiKG7-VmZf2ohhA8eEY1z_grDIci1FR5INnvhsAy0.jar
    Sep 30, 2020 6:45:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-MInXDxCmFQYZ-R3ru4OYGwsomv6wVz0p_k4iwQ77jZ0.jar
    Sep 30, 2020 6:45:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-mj4Nm0a98QMj2i5XtabA54lunoEnugIGXImzdJa32dw.jar
    Sep 30, 2020 6:45:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-o6_aBtCACqHcGlUMlIS8mlX0ui27m2KEEl31xwOrSvI.jar
    Sep 30, 2020 6:45:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-SFFsR7FJRxB6cnv6Ij2RQ8UH5xzkiW92HQ2iXr_lIjM.jar
    Sep 30, 2020 6:45:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-9ZJzYZd6YyT2kfkW9RaPM1OyOIbn9ERYt_5AHo6byGI.jar
    Sep 30, 2020 6:45:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-yWRtYmt4m4cn6CouWasm1vJiqn_bY6ygQt4v68-zFtw.jar
    Sep 30, 2020 6:45:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-_3lbb06nnS6YV7Yjqvcr5Z46LpcZT00AOC5IPuzVHQM.jar
    Sep 30, 2020 6:45:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-md9PAUi7stQajKKxgqF1PN2WNXqrTOsXeaQxI_sB8tk.jar
    Sep 30, 2020 6:45:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-rxA40I4iCtvAhYL6ewJ6wjmOw0c6pSNqVoLKHk-bsMw.jar
    Sep 30, 2020 6:45:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-zGr6hp_tOPjjTrGxPLWbUKxzR6jOJPm_s3yZ7SIIqEE.jar
    Sep 30, 2020 6:45:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-VrqIKQDyb_K1P9vQvDz1GgCV5VFsHb9KefXB3Yn04fU.jar
    Sep 30, 2020 6:45:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-sOLtpusI33eqrk0Rsigsj42S7D-F7aA9xywjhu7bzl0.jar
    Sep 30, 2020 6:45:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-3jCjDTdqKrK5BwefnPLyCe0Iqtrf45DzRXQZvjzT5DY.jar
    Sep 30, 2020 6:45:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-8pkolRL7me3Ub_GnipujJ3ckvwHDQM7vzrKc5LvOSlA.jar
    Sep 30, 2020 6:45:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-yJk28HO3tOqg5ojMVK0aMcMKh0fcrdtgcctJhnZZA58.jar
    Sep 30, 2020 6:45:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-w4p3HA0kJEN1ZgY7cWMNBwZr4UvZVEvC7La-T5JGD34.jar
    Sep 30, 2020 6:45:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-pbtboAXIm04olc2z9HU7leQdg5VKkeIwtvUGBbR7SYY.jar
    Sep 30, 2020 6:45:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test1193377539281656684.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-tcZjIbOez4CKAfr4K6L9GliN5uUn8idrM1kE4NEcYSE.jar
    Sep 30, 2020 6:45:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-zJzxw7gRFkSnH7S5MkYwmpSXrxt0ZIlcSqY3qWKlUk4.jar
    Sep 30, 2020 6:45:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-pb22rXFZMBaoQoUr8aPUBvZKTpqOEIrwj9SiXuKd0mc.jar
    Sep 30, 2020 6:45:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-WQNZJhlKO4RUUadrVpT1GOU4GA3q9FeP3yZ1PGAZhf8.jar
    Sep 30, 2020 6:45:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-4GQo9Byht0TPo7DEGSH995JYzUoXuZ9Nqtr6huwwxr4.jar
    Sep 30, 2020 6:45:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-rhck1a7X0I3HmZKAyAdf5mpkJIsqCsfG69Q2JMrm5yw.jar
    Sep 30, 2020 6:45:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-Lc46Pu7JFbPShoiPVGrAvBrOjndur8ppAQCHPRBsF8M.jar
    Sep 30, 2020 6:45:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-OWVBiXowbv03czyiKp-nwE5pHJgl0GvsE5j2MKM3l2c.jar
    Sep 30, 2020 6:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-8Y1yhVCAAybTMBPL3psN8NAVSVs0WexrBogZfUkmWYw.jar
    Sep 30, 2020 6:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-tFPR1h8-sRpOzDZdK1ME55kd-O6eqlGwKTkCJpuTME0.jar
    Sep 30, 2020 6:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-TyxRh4hoSym4hOsAVn1UhUaBh5Z1qFb_jKDMnHI8Kwg.jar
    Sep 30, 2020 6:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 190 files cached, 30 files newly uploaded in 1 seconds
    Sep 30, 2020 6:45:51 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 30, 2020 6:45:51 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 30, 2020 6:45:51 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 30, 2020 6:45:51 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 30, 2020 6:45:51 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 30, 2020 6:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95918 bytes, hash c23fd578ef22c09ae1c7e4a0fac7f7dd4e823f540260a6886ecdb969bc9e6ea9> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-wj_VeO8iwJrhx-Sg-sf33U6CP1QCYKaIbs25abyebqk.pb
    Sep 30, 2020 6:45:52 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Sep 30, 2020 6:45:53 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-30_11_45_52-1626663165689262608?project=apache-beam-testing
    Sep 30, 2020 6:45:53 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-30_11_45_52-1626663165689262608
    Sep 30, 2020 6:45:53 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-30_11_45_52-1626663165689262608
    Sep 30, 2020 6:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-30T18:45:52.229Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 30, 2020 6:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T18:46:01.688Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 30, 2020 6:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T18:46:03.963Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 30, 2020 6:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T18:46:04.005Z: Expanding GroupByKey operations into optimizable parts.
    Sep 30, 2020 6:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T18:46:04.078Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 30, 2020 6:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T18:46:04.154Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 30, 2020 6:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T18:46:04.201Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 30, 2020 6:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T18:46:04.225Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 30, 2020 6:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T18:46:04.257Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 30, 2020 6:46:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T18:46:05.140Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 30, 2020 6:46:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T18:46:05.222Z: Starting 5 workers in us-central1-b...
    Sep 30, 2020 6:46:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T18:46:18.873Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 30, 2020 6:46:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T18:46:32.997Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 30, 2020 6:46:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T18:46:49.944Z: Workers have started successfully.
    Sep 30, 2020 6:46:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T18:46:50.002Z: Workers have started successfully.
    Sep 30, 2020 6:47:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T18:47:23.138Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 30, 2020 6:47:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T18:47:23.554Z: Cleaning up.
    Sep 30, 2020 6:47:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T18:47:23.659Z: Stopping worker pool...
    Sep 30, 2020 6:48:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T18:48:13.466Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 30, 2020 6:48:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T18:48:13.507Z: Worker pool stopped.
    Sep 30, 2020 6:48:19 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-30_11_45_52-1626663165689262608 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 3443c772-4ea8-490c-bea0-0034df0e51ae and timestamp: 2020-09-30T18:48:19.471000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    14.515

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 30, 2020 6:48:19 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 3 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.035 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 2 mins 42.659 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 2s
107 actionable tasks: 73 executed, 34 from cache

Publishing build scan...
https://gradle.com/s/gj5obdeouapgw

Stopped 2 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1058

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1058/display/redirect>

Changes:


------------------------------------------
[...truncated 273.82 KB...]
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 30, 2020 12:45:30 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 30, 2020 12:45:30 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 30, 2020 12:45:31 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 30, 2020 12:45:31 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 30, 2020 12:45:31 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 30, 2020 12:45:31 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 30, 2020 12:45:31 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
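
Both failures in this build are the same schema/coder problem seen above. For reference, here is a minimal, self-contained sketch of running a query shaped like the one in these logs with SqlTransform over an in-memory, schema-aware input (field names and sample rows are illustrative; the IT itself converts the plan with BeamSqlRelUtils.toPCollection against the BigQuery table):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class SqlTransformSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        Schema schema = Schema.builder()
            .addStringField("author")
            .addStringField("type")
            .addStringField("title")
            .addInt32Field("score")
            .build();

        PCollection<Row> input = p.apply(
            Create.of(
                    Row.withSchema(schema).addValues("alice", "story", "A story", 5).build(),
                    Row.withSchema(schema).addValues("bob", "comment", "A comment", 1).build())
                .withRowSchema(schema));

        // A single input PCollection is queried under the name PCOLLECTION.
        input.apply(
            SqlTransform.query(
                "SELECT author, `type`, title, score FROM PCOLLECTION "
                    + "WHERE (`type` = 'story' OR `type` = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }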

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 30, 2020 12:45:31 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 30, 2020 12:45:31 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 30, 2020 12:45:31 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 30, 2020 12:45:31 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 30, 2020 12:45:31 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 30, 2020 12:45:31 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 30, 2020 12:45:31 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 30, 2020 12:45:31 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 30, 2020 12:45:32 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 30, 2020 12:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 30, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-Osn9U0E7G_FpFG_KwJG2OXrb-40dUaRKuKWw24XDFsc.jar
    Sep 30, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-jtgeCNxBnSYn2hMfl7_t0DN6RFIFLdhyreSK8uxz-eE.jar
    Sep 30, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-dVvnf2CdGrD4sW66-cBlqiguhHHWpTiWvCqZZOLYF8s.jar
    Sep 30, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-qghle6w0vGz5GCTdFYbLE8p-CYXSltHEHLXEavG04gM.jar
    Sep 30, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-gP38SmqAZ-wSOrjvdvORi_71KB8I3u9nGPzwEiLgrxE.jar
    Sep 30, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-bJIEL6d7-96rN7l_PynkItbdU6a8iZT3buYAgRSIHiE.jar
    Sep 30, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-ceLoPaONeNU89DWOxAk_J2iPUoL-80VkSnj5M2w_RhM.jar
    Sep 30, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-qqJoGKHyGMCTScR-FkV-azj9O4cscvh5fUKCr7-hovU.jar
    Sep 30, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-aLzwIPTd6FwPK_pkb1Xn519rPCWFT-emSEvjieY2FLk.jar
    Sep 30, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-SGePdnhis0gv0mWFODvP0-lRpjARguvJ-JgwcdzqHLU.jar
    Sep 30, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-L7BaHrMEJH3T3nzDtlOytftwBSf4w_DPafhpqeuJCIg.jar
    Sep 30, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7352304566873775243.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-VETHo8LH_r_9b3Ov-BsueRcmPjNKiLaiqroRi-g-I8c.jar
    Sep 30, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-cMO4PbWL3kdS96H6umeR8Znp_gc39WOF2Env14P7Ux8.jar
    Sep 30, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-gP38SmqAZ-wSOrjvdvORi_71KB8I3u9nGPzwEiLgrxE.jar
    Sep 30, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-nsInsFISszx4gzy_jt4vzrvpL-VulGGPG8BWf-aRwBg.jar
    Sep 30, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-hJv1Rl8GHfzI75aa2iNEiJaNUFBedMUveqQM8hl7NOI.jar
    Sep 30, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-ZXG7ZknFg7H3VjC1AGiCXjhZuStdf1WuodKKg6Hm2pI.jar
    Sep 30, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-Sw6lME1OrXbjJGTDsn90fAqBEeECZ5iLkF-vunhW1uw.jar
    Sep 30, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-bd8ciYcAr5biqhQogVCfY19fape-G_CKOhhlY8zlE4I.jar
    Sep 30, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-TXe1r8-CDWPgHqLw9nJQtT81zvaxY8nRY8USvkgLIZI.jar
    Sep 30, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/6.6.1/workerMain/gradle-worker.jar to gs://temp-storage-for-perf-tests/loadtests/staging/gradle-worker-Ora8jtQMVjsUpI1we4r1Fm1LEPQlVZSqP04WYkOOCJc.jar
    Sep 30, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-InEnt0drvDmNfnRpXjj9oHvki1U5YtNQSwnj8UucgWo.jar
    Sep 30, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-guaRtWeHOZkH34mLCcSFUT_3tZ3G4IHjJVDEOj5IK0M.jar
    Sep 30, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-pQaqijK3NUIva_AOWDQg8F2Dg9iEKE-ZvVTNZcavg04.jar
    Sep 30, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-jeIApskJio32QDkYHE-eBB26N_5bMvs17KPrbbBdAdU.jar
    Sep 30, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-zNa9si9GHTgjNpSbKRM0RqFPEPWCTQajol-8WAd_GbE.jar
    Sep 30, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-9r9VUvI5pF_0arYyKs0tcifrLM84pA_O7jhl9AL7_L0.jar
    Sep 30, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-yjNW0AZERPtQDvQtG3TiIlKXXGJRW6njE3uE0ByUl30.jar
    Sep 30, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-g_h9h9ICUitXAroM_qfHF7jXHM2e6PC4mRhTPv3IEMQ.jar
    Sep 30, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-CqY4WzofWmtHZBP4ziRnXtAMkb6cywl038yz87ds8V8.jar
    Sep 30, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-UNt0atMulRAKqcd98lesmjx8qRZK8Czdz51zIOLbgYA.jar
    Sep 30, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-qndCLwDT0dk8i9Tr4eXzaAQGkHiVyXtiJ-cKihx47T8.jar
    Sep 30, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 31 files newly uploaded in 0 seconds
    Sep 30, 2020 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 30, 2020 12:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 30, 2020 12:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 30, 2020 12:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 30, 2020 12:45:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 30, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95918 bytes, hash e9a85cc77fd3cfa16014f47d4fa309ee95d5db3f2198db91de2a75024e2d4ab0> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-6ahcx3_Tz6FgFPR9T6MJ7pXV2z8hmNuR3ip1Ak4tSrA.pb
    Sep 30, 2020 12:45:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Sep 30, 2020 12:45:37 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-30_05_45_36-14555014956868873490?project=apache-beam-testing
    Sep 30, 2020 12:45:37 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-30_05_45_36-14555014956868873490
    Sep 30, 2020 12:45:37 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-30_05_45_36-14555014956868873490
    Sep 30, 2020 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-30T12:45:36.563Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 30, 2020 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T12:45:44.118Z: Worker configuration: n1-standard-1 in us-central1-c.
    Sep 30, 2020 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T12:45:44.915Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 30, 2020 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T12:45:44.963Z: Expanding GroupByKey operations into optimizable parts.
    Sep 30, 2020 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T12:45:45.031Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 30, 2020 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T12:45:45.114Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 30, 2020 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T12:45:45.160Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 30, 2020 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T12:45:45.192Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 30, 2020 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T12:45:45.213Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 30, 2020 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T12:45:45.615Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 30, 2020 12:45:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T12:45:45.691Z: Starting 5 workers in us-central1-c...
    Sep 30, 2020 12:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T12:46:10.762Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 30, 2020 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T12:46:16.662Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 30, 2020 12:46:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T12:46:34.437Z: Workers have started successfully.
    Sep 30, 2020 12:46:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T12:46:34.472Z: Workers have started successfully.
    Sep 30, 2020 12:47:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T12:47:14.432Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 30, 2020 12:47:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T12:47:14.570Z: Cleaning up.
    Sep 30, 2020 12:47:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T12:47:14.659Z: Stopping worker pool...
    Sep 30, 2020 12:48:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T12:48:08.395Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 30, 2020 12:48:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T12:48:08.442Z: Worker pool stopped.
    Sep 30, 2020 12:48:15 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-30_05_45_36-14555014956868873490 finished with status DONE.
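
For context, the WARNING near the top of this job log ("The requested max number of workers (5) is ignored as autoscaling is explicitly disabled") is what the Dataflow service reports when the worker pool runs at a fixed size. Below is a minimal sketch of the option combination that produces it, using values read off the log (project, region, machine type, worker count); the exact flags the Jenkins job passes are not part of this output:

    import org.apache.beam.runners.dataflow.DataflowRunner;
    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions.AutoscalingAlgorithmType;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class FixedWorkerPoolOptionsSketch {
      public static void main(String[] args) {
        DataflowPipelineOptions options =
            PipelineOptionsFactory.fromArgs(args).as(DataflowPipelineOptions.class);

        // Values taken from the log above; an illustration, not the job's exact configuration.
        options.setRunner(DataflowRunner.class);
        options.setProject("apache-beam-testing");
        options.setRegion("us-central1");
        options.setWorkerMachineType("n1-standard-1");
        options.setNumWorkers(5);
        options.setMaxNumWorkers(5);
        // Disabling autoscaling is what makes Dataflow ignore maxNumWorkers and emit the WARNING.
        options.setAutoscalingAlgorithm(AutoscalingAlgorithmType.NONE);

        // A real run would also set temp/staging locations before Pipeline.create(options).run().
      }
    }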

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): af10f2c1-b72f-4933-97fd-5635d0562d44 and timestamp: 2020-09-30T12:48:16.002000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    25.416

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 30, 2020 12:48:16 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 2 mins 52.673 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 59s
107 actionable tasks: 65 executed, 42 from cache

Publishing build scan...
https://gradle.com/s/6qo7g2vl424by

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1057

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1057/display/redirect>

Changes:


------------------------------------------
[...truncated 275.46 KB...]
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 30, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 30, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 30, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 30, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 30, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 30, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 30, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
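
Both failing tests hit the same coder-inference error, and the message itself names the fix: give the PCollection<Row> produced by the monitoring ParDo an explicit schema via PCollection.setRowSchema (or a coder via setCoder). Below is a minimal, self-contained sketch of that pattern; the field names follow the projected columns in the SQL above, while the field types and the pass-through DoFn standing in for RowMonitor are assumptions, since the test code itself is not part of this log:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.schemas.Schema.FieldType;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        // Field names follow the query's projection; types are assumed for illustration.
        Schema schema =
            Schema.builder()
                .addNullableField("author", FieldType.STRING)
                .addNullableField("type", FieldType.STRING)
                .addNullableField("title", FieldType.STRING)
                .addNullableField("score", FieldType.INT32)
                .build();

        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        Row sample = Row.withSchema(schema).addValues("someone", "story", "a title", 3).build();

        PCollection<Row> rows =
            p.apply(Create.of(sample).withCoder(RowCoder.of(schema)))
                // Pass-through ParDo standing in for the RowMonitor step seen in the stack trace.
                .apply("RowMonitor", ParDo.of(new DoFn<Row, Row>() {
                  @ProcessElement
                  public void processElement(@Element Row row, OutputReceiver<Row> out) {
                    out.output(row);
                  }
                }))
                // Without this, coder inference fails exactly as in the IllegalStateException above.
                .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }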

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 30, 2020 6:45:33 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 30, 2020 6:45:33 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 30, 2020 6:45:33 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 30, 2020 6:45:33 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 30, 2020 6:45:33 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 30, 2020 6:45:33 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 30, 2020 6:45:33 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 30, 2020 6:45:33 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 30, 2020 6:45:34 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 30, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 30, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-G4j8L1hb91zJA1mODNyIJol5fO6OIdYLWLRR93RQnw4.jar
    Sep 30, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-jcZL2HdwHa39fp2LtQMINfq_MP7ZoihtWLmGW6UCkk4.jar
    Sep 30, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-Vugetv-LW7wLQhjQIt5uzyYEvFrkECcW0o9TAHkArmw.jar
    Sep 30, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-EhVtn4GBkm6vwVm78YKofJ-hECnW1MUrKWfJIEDhFTI.jar
    Sep 30, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-G4j8L1hb91zJA1mODNyIJol5fO6OIdYLWLRR93RQnw4.jar
    Sep 30, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-Vfi63ygK9r6gjlR8pinBccbIydRLlZSzigqHEnwRn4U.jar
    Sep 30, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-SQC5FFR6IhdmIBCkfbnu9ja7OcpkrCZ5oCjcYaH195s.jar
    Sep 30, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-mQ2bwJLwbjR_F-JAB9496a5pPKN8B1rGGkmEwgrUNww.jar
    Sep 30, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-yfXUCSr-U2T3TxwEkG2T6aLpw-VzOdQR6Yc2zaOXQ8o.jar
    Sep 30, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5277594357761847256.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-coLSbNUQHsphxMiB7s8x57I_BuJYRmuVrqzL_HuDZU0.jar
    Sep 30, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-Cau_0jvmuTUTvFjkrVDRiIO8sAObOZQyMhfOXajVjx0.jar
    Sep 30, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-w0K1qSf0WkgkybYy3E1qmMS1kON8pXOcMemimnUxQkc.jar
    Sep 30, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-PqZNPg5uJU0id6FPUW2riCji34UC4VGuOM0ysLVzpUk.jar
    Sep 30, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-mropw2GNTyStjUwAjw4Y70Fahzq47r4Oc3EydgTAonc.jar
    Sep 30, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-ZVUl8WA32vYVryLN1fp_mjGxa5eFrkTcMRlkFMpG6f0.jar
    Sep 30, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-2feGg9H3QA42MJp4ERRzGRcMQ37XWgmLRUImT_OxRLU.jar
    Sep 30, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-mH96JueqGDzrNzckvGbb4C9YcxMIaquQQb25HBS1EjE.jar
    Sep 30, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-biyyGhK0X5OYc6PQ_3dIW19PhJujBptRx4G-gPKirQ4.jar
    Sep 30, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-HzDgdbuSyccxC4lYPb-MzV5gsdpSWIQhTBGjr_XkuvU.jar
    Sep 30, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-xCFZoD9j9zNo8qhogrdZlzGVf6ui_Iccb5TaWVC5zAU.jar
    Sep 30, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-MFwBj_xuN0F1ar-WyHb9pgXtsOM9_HD_2ilxKRVs8J8.jar
    Sep 30, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-pp2rO-BaxXjV2D5Dnt9MF1mg7Hm6nSj3MLUvkgx2ACs.jar
    Sep 30, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-q7NZSFP7UBqAhHtCjsKLKoO1QEBu_6wGrVLEdzyovRM.jar
    Sep 30, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-js9TxKIKFNE0XR5bH07zAjIogHEaahMbpmmkHYb7chE.jar
    Sep 30, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tmj29aGgloqT6ENN0D92aLCO1-vz5x5vdf4yEp7VlzE.jar
    Sep 30, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-EzMJIhGCizO1xiCBV7L5lUetQ1hAyWB8nNGDZBalq2Q.jar
    Sep 30, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-72UJCxu8IKYv4skYSKR81x0WMVDcifNjpgGYg-TBAJc.jar
    Sep 30, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 194 files cached, 26 files newly uploaded in 1 seconds
    Sep 30, 2020 6:45:38 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 30, 2020 6:45:38 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 30, 2020 6:45:38 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 30, 2020 6:45:38 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 30, 2020 6:45:38 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 30, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95918 bytes, hash b6d7dcd795139ca23d5ed43f6cb949440a8c4eda6c19702ca47ae7d5e55777b7> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-ttfc15UTnKI9XtQ_bLlJRAqMTtpsGXAspHrn1eVXd7c.pb
    Sep 30, 2020 6:45:38 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Sep 30, 2020 6:45:39 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-29_23_45_38-7878334072495782317?project=apache-beam-testing
    Sep 30, 2020 6:45:39 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-29_23_45_38-7878334072495782317
    Sep 30, 2020 6:45:39 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-29_23_45_38-7878334072495782317
    Sep 30, 2020 6:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-30T06:45:38.559Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 30, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T06:45:47.137Z: Worker configuration: n1-standard-1 in us-central1-c.
    Sep 30, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T06:45:47.716Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 30, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T06:45:47.790Z: Expanding GroupByKey operations into optimizable parts.
    Sep 30, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T06:45:47.822Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 30, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T06:45:47.900Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 30, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T06:45:47.936Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 30, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T06:45:47.963Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 30, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T06:45:47.998Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 30, 2020 6:45:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T06:45:48.817Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 30, 2020 6:45:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T06:45:48.904Z: Starting 5 workers in us-central1-c...
    Sep 30, 2020 6:46:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T06:46:00.427Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 30, 2020 6:46:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T06:46:17.154Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 30, 2020 6:46:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T06:46:40.481Z: Workers have started successfully.
    Sep 30, 2020 6:46:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T06:46:40.530Z: Workers have started successfully.
    Sep 30, 2020 6:47:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T06:47:12.583Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 30, 2020 6:47:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T06:47:12.767Z: Cleaning up.
    Sep 30, 2020 6:47:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T06:47:12.868Z: Stopping worker pool...
    Sep 30, 2020 6:48:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T06:48:03.659Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 30, 2020 6:48:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T06:48:03.702Z: Worker pool stopped.
    Sep 30, 2020 6:48:09 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-29_23_45_38-7878334072495782317 finished with status DONE.
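
The plan above shows what the passing push-down test actually reads: usedFields=[by, type, title, score] plus the filter reported by "Pushing down the following filter". At the BigQueryIO level the same read can be expressed as a Storage API (DIRECT_READ) read with a field selection and a row restriction. A rough sketch follows, with a placeholder table reference because the table the test points at is not shown in this log:

    import com.google.api.services.bigquery.model.TableRow;
    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.values.PCollection;

    public class DirectReadSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        PCollection<TableRow> rows =
            p.apply(
                "Read HACKER_NEWS with push-down",
                BigQueryIO.readTableRows()
                    // Placeholder table reference; substitute the real project:dataset.table.
                    .from("some-project:some_dataset.HACKER_NEWS")
                    .withMethod(Method.DIRECT_READ)
                    // Mirrors usedFields=[by, type, title, score] from the BEAMPlan above.
                    .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                    // Mirrors the filter the BigQuery table provider reports pushing down.
                    .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }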

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): f4de1de7-cde3-4fdf-aae9-7d3b27a534db and timestamp: 2020-09-30T06:48:09.228000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    13.463

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 30, 2020 6:48:09 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.019 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 2 mins 44.669 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 52s
107 actionable tasks: 65 executed, 42 from cache

Publishing build scan...
https://gradle.com/s/j5ec3fcv3funy

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1056

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1056/display/redirect?page=changes>

Changes:

[tysonjh] Fix Java nightly snapshot build.

[sychen] Fix failed GroupIntoBatchesOverrideTest

[noreply] [BEAM-8024] Add JPMS E2E test (#12899)

[srohde] Adds duration string to the ib.show and ib.collect

[Kyle Weaver] [BEAM-10953] Add errorprone-slf4j plugin.

[Kyle Weaver] [BEAM-10953] Fix errorprone-slf4j errors.

[noreply] [BEAM-8106] Add version to java container image name (#12505)

[Robert Bradshaw] Remove experimental declarations from fileio.

[noreply] [BEAM-10677] Fix @SchemaFieldName in AutoValueSchema (#12520)


------------------------------------------
[...truncated 278.59 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 30, 2020 12:46:20 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 30, 2020 12:46:20 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 30, 2020 12:46:20 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 30, 2020 12:46:20 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 30, 2020 12:46:20 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 30, 2020 12:46:20 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 30, 2020 12:46:21 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 30, 2020 12:46:21 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 30, 2020 12:46:21 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 30, 2020 12:46:21 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 30, 2020 12:46:21 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 30, 2020 12:46:21 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 30, 2020 12:46:21 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 30, 2020 12:46:21 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 30, 2020 12:46:21 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 30, 2020 12:46:22 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 30, 2020 12:46:24 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 30, 2020 12:46:24 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-Qu8vNawVK1SacREE87sOYuQG-VMa6pqiWnQyUtAoLYI.jar
    Sep 30, 2020 12:46:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-oOZ_8sfWmEHZO0FccBW6ePwcJ2n1dEKlX0XCl4rnUZE.jar
    Sep 30, 2020 12:46:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-BNDklvMapjBbD1Mz-ouHcOAxm9oqiUuwVVcPYY9VuZg.jar
    Sep 30, 2020 12:46:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-vt7ywL-YEC_Z-1yqK05WtSNwPy6tQt21lp7p_X_YF4A.jar
    Sep 30, 2020 12:46:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-XghNs7OYlKgXIiDC9cPMjrde5TaX7gFoApD9ip5gGR8.jar
    Sep 30, 2020 12:46:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-K-66a0DFwLWc7Q--DpfoQtH6rxhDBgaFGq0hEYJhDyU.jar
    Sep 30, 2020 12:46:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-m24-cC-Z3n-4TQyNwT2YKK_PDnG3tyaQpYFqtqDXddc.jar
    Sep 30, 2020 12:46:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-EvZhiYneNWZmZ1XVMdSpPQgHVptkEqiwYBSCFeDiWuA.jar
    Sep 30, 2020 12:46:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-6RBkjz7sTk67gWHa_7jyDEfPPpnwatps_BNqKbkY9mg.jar
    Sep 30, 2020 12:46:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-9aVu72ZiAptuRRhcpvbcdPkK7J2Qx2bQ9-Onx7SfWtQ.jar
    Sep 30, 2020 12:46:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-HRZg9ecIIrCL4-u_FVFzZp742QT0yr3BfAKOeTMyO-w.jar
    Sep 30, 2020 12:46:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test6292497781437164006.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-qnUM1mwqgdRIgcKlMV6VXfbaMfIDwFaTxl-GmdimXdI.jar
    Sep 30, 2020 12:46:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-TX81Etxzk719Cz1t7XXG7WfGYocRQHjtjgl8hBdhjCk.jar
    Sep 30, 2020 12:46:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-7HgJaDaoz44SEq2GJ1kjWjB4OO6vzYfrB7FrF68n714.jar
    Sep 30, 2020 12:46:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-1H51Tg72MKw_X4lqBfFQU_ReNkCXxP_x8M2wRzEMPjc.jar
    Sep 30, 2020 12:46:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-D2H55aQIrHNV1EdcrTpThaVT64McxzKjcdjwpHJwboM.jar
    Sep 30, 2020 12:46:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-NcAXLrybYq1pbnFGueHRJiWK42D_T9UpGThqDcV9tyM.jar
    Sep 30, 2020 12:46:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-jetHsJJcIbaD9CMvNQdCZzShbgd8Eg1YijhsQn2fXy8.jar
    Sep 30, 2020 12:46:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-yosL-3tHj0SNwRNYqpw1b5tDEG32IRape7wrmCotmDo.jar
    Sep 30, 2020 12:46:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-rMjurNqj5nWWoMKjVHwBrgfgUFLrhJi3oUTYpOz7-Js.jar
    Sep 30, 2020 12:46:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-qIF4IGg80jc-B0cTSxh-sSQ3QQhrGfEc6nNAwKIis5Y.jar
    Sep 30, 2020 12:46:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-6cbTAM3oRELUmfyh9sCVbcC0GdgE5aLh6DlOh53mEWw.jar
    Sep 30, 2020 12:46:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-ANYYBfGUjwX8kHZ88hrDvmv2vLE8_VnoK4INOvq2JKI.jar
    Sep 30, 2020 12:46:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-k39F7YT67oUCMd4XjRv-cG45olhzULxXrCY0XU8-ccI.jar
    Sep 30, 2020 12:46:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-WQb4ac3k_wGGHgZgo3ZzVqJMWTQTOJN11ZM5GNrUgdI.jar
    Sep 30, 2020 12:46:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-Qu8vNawVK1SacREE87sOYuQG-VMa6pqiWnQyUtAoLYI.jar
    Sep 30, 2020 12:46:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-oJFbHDiP91yfDq62VgzGbsbPsOxMRw4scYq6rXQaj4o.jar
    Sep 30, 2020 12:46:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-_eeyPBZ0DRGgTSIVJN4MYUwjMO0v6O72iJ0DVVk8vzo.jar
    Sep 30, 2020 12:46:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-zbvSYDhu2VimlJElH88VG4Isib81Ui5poftcbQ5kuBI.jar
    Sep 30, 2020 12:46:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT--XdPkjONBMm3KjXDOzZflr4WFU8aXS7ZUY2x8qWg6x4.jar
    Sep 30, 2020 12:46:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-tGPHqSSQDkwPtP0cPhZpH2G-W4WR3n-xL6k4z30B33I.jar
    Sep 30, 2020 12:46:26 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 190 files cached, 30 files newly uploaded in 1 seconds
    Sep 30, 2020 12:46:26 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 30, 2020 12:46:26 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 30, 2020 12:46:26 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 30, 2020 12:46:26 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 30, 2020 12:46:26 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 30, 2020 12:46:26 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95918 bytes, hash 39fdeba08e515c5a74400bda31396ac7b6652d57167aff6c8c91ab7d94c1b044> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Of3roI5RXFp0QAvaMTlqx7ZlLVcWev9sjJGrfZTBsEQ.pb
    Sep 30, 2020 12:46:26 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Sep 30, 2020 12:46:27 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-29_17_46_26-9821662910935396458?project=apache-beam-testing
    Sep 30, 2020 12:46:27 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-29_17_46_26-9821662910935396458
    Sep 30, 2020 12:46:27 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-29_17_46_26-9821662910935396458
    Sep 30, 2020 12:46:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-30T00:46:26.813Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 30, 2020 12:46:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T00:46:33.988Z: Worker configuration: n1-standard-1 in us-central1-c.
    Sep 30, 2020 12:46:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T00:46:34.852Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 30, 2020 12:46:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T00:46:35.062Z: Expanding GroupByKey operations into optimizable parts.
    Sep 30, 2020 12:46:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T00:46:35.111Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 30, 2020 12:46:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T00:46:35.284Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 30, 2020 12:46:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T00:46:35.342Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 30, 2020 12:46:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T00:46:35.397Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 30, 2020 12:46:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T00:46:35.442Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 30, 2020 12:46:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T00:46:35.980Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 30, 2020 12:46:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T00:46:36.096Z: Starting 5 workers in us-central1-c...
    Sep 30, 2020 12:46:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T00:46:41.871Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 30, 2020 12:47:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T00:47:00.044Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 30, 2020 12:47:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T00:47:20.127Z: Workers have started successfully.
    Sep 30, 2020 12:47:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T00:47:20.187Z: Workers have started successfully.
    Sep 30, 2020 12:47:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T00:47:51.469Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 30, 2020 12:47:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T00:47:51.734Z: Cleaning up.
    Sep 30, 2020 12:47:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T00:47:51.849Z: Stopping worker pool...
    Sep 30, 2020 12:48:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T00:48:45.546Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 30, 2020 12:48:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T00:48:45.607Z: Worker pool stopped.
    Sep 30, 2020 12:48:51 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-29_17_46_26-9821662910935396458 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): a8225d87-51e6-45c5-87c4-34811da21345 and timestamp: 2020-09-30T00:48:51.895000000Z:
                     Metric:                    Value:
                   read_time                    11.559
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 30, 2020 12:48:52 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 3 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.024 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.032 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 8,5,main]) completed. Took 2 mins 41.461 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 36s
107 actionable tasks: 71 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/nmz3kjxmi67j4

Stopped 2 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1055

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1055/display/redirect?page=changes>

Changes:

[Kyle Weaver] [BEAM-10971] Redirect contribute/design-documents to cwiki.


------------------------------------------
[...truncated 270.79 KB...]
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 29, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 29, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 29, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 29, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 29, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 29, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 29, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

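The failure above is the coder pitfall the message itself describes: a PCollection<Row> produced by a ParDo has no inferable coder until a schema (or a RowCoder) is attached. A minimal, self-contained sketch of that remedy, using an assumed four-column schema and a hypothetical pass-through DoFn rather than the test's actual RowMonitor code:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      // Hypothetical stand-in for a RowMonitor-style pass-through ParDo.
      static class PassThroughFn extends DoFn<Row, Row> {
        @ProcessElement
        public void process(@Element Row row, OutputReceiver<Row> out) {
          out.output(row);
        }
      }

      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        // Assumed schema matching the four projected columns in the query above.
        Schema schema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();

        Row row = Row.withSchema(schema).addValues("someone", "story", "A title", 3L).build();

        PCollection<Row> rows =
            p.apply(Create.of(row).withCoder(RowCoder.of(schema)))
                .apply("RowMonitorLike", ParDo.of(new PassThroughFn()))
                // Without this (or setCoder(RowCoder.of(schema))), finishSpecifying()
                // throws the IllegalStateException seen in the log, because a Row
                // PCollection has no inferable default coder.
                .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }

In the integration test itself the schema would come from the table definition; the sketch only shows where setRowSchema (or setCoder) has to be attached before the output is consumed.
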
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 29, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 29, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 29, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 29, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 29, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 29, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 29, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 29, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
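
For context, the plan and the filter line above are what the push-down test exercises: with the table declared as DIRECT_READ, the planner hands the projected columns and the supported predicate to the BigQuery Storage Read API instead of evaluating them inside the Beam pipeline. A rough sketch of the equivalent read at the BigQueryIO level, with the table name assumed rather than taken from the test's configuration:

    import java.util.Arrays;
    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;
    import org.apache.beam.sdk.values.PCollection;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        // Selected fields and row restriction mirror what the planner pushed into
        // BeamPushDownIOSourceRel above; the table reference is an assumption.
        PCollection<TableRow> rows =
            p.apply(
                "ReadHackerNews",
                BigQueryIO.readTableRows()
                    .from("bigquery-public-data:hacker_news.full")
                    .withMethod(TypedRead.Method.DIRECT_READ)
                    .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                    .withRowRestriction(
                        "(`type` = 'story' OR `type` = 'job') AND `score` > 2"));

        p.run().waitUntilFinish();
      }
    }

The row-restriction string is the same predicate printed in the log; any part of the filter the planner marks as unsupported would remain in BeamCalcRel and be evaluated in the pipeline instead.
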
    Sep 29, 2020 6:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-S06h9KYUxdf1fBF-19y1OfwUSev7HERLGmwuVsIjbME.jar
    Sep 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-Xkcrv9cnsd090y7b4cgb8Z7yqqW3WsAy1gO_cP0r6Fc.jar
    Sep 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test332511998639500270.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-5le9ISdClL_9yVLI-GoVl4Qzx7064AYwA4VyiySJcuM.jar
    Sep 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-BJ0GHtunGji1mEOXLCbZKaNudgnSat-eXAi_IYei5xk.jar
    Sep 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-gL2nmzax8WxxRbP-fmGjiwhsyp3Cw7na7EzXclkF4gw.jar
    Sep 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-EB4gUQDR4KO-ZgQbIA5RFC-C73Y7gZakO98zqs60A1o.jar
    Sep 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-J4TjeFovjl2Ab978ZAC46jg0YoJm24VzU8q_Pzxk13Q.jar
    Sep 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-kVAUSxSXkZ8bvVE48rsfNC3uDu1MkjETCDmkldtFhso.jar
    Sep 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-of0WpIk9ST5x0cIduRBg0zJiw7qN8ak6Vtpn4aBE6Fw.jar
    Sep 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-ZbVdrMdp8zRcOyyeQXxl2CWiv8QQtGdY9m1P6EG-eho.jar
    Sep 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-DHGO1exNNd6mjpuglbGrAMKaw9np9zc0_6Uqf2OBWSk.jar
    Sep 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-gdyOSs9_HxDNDTcB2S3BKTx0jCJ-aektLvDsnxeSDMA.jar
    Sep 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-7IkCxE_rNGqgjHIqMGmt9kO4d0whFw_qwIrJdonrCuM.jar
    Sep 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-n7zonTzXVoDqW7fU83x2w9rFdRIoyHhgNcrKw6YIcvg.jar
    Sep 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-OzI3gMzq5HkDmnBOFLIh6fadIMU71HvX_vtS1JeeBgw.jar
    Sep 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-0aMNet4pr_9nlByVkEhCMsWe1C0cZ_2ZwJbg8_NwB0A.jar
    Sep 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-7Mw0MvFBw1Fn5CYUjbZIByP0F2jUFXjaQqTNV9gSgnM.jar
    Sep 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-S06h9KYUxdf1fBF-19y1OfwUSev7HERLGmwuVsIjbME.jar
    Sep 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-Y9y_mYphv9cXAo_VJaeDMnedmUgA6TVbcjjYdNgoUvk.jar
    Sep 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-KD5akkfEmu0R2Akxdpy8EDnDx0Sw8fInvHLe56iITnY.jar
    Sep 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-7YOGY64KozuqBQO7t5CE4WbakEn2IKJZmCkK_bxY2NQ.jar
    Sep 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-VWLN9C1vbKwEYDDqhUJRj2pJ5pGMGZCO0UYKzf2eu3k.jar
    Sep 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-K5hy_lt1m0EdRSFutNxACRSBJH4H9xMzbCKLZZqsehc.jar
    Sep 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-C7opOM-V1ki1TjwOQrCFMQtxkdd9SfGFmrpByy76uLA.jar
    Sep 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-h9170ik7GzOUhQQAOWLBi23e0zlkl2eTz8xLIR0IlbY.jar
    Sep 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-CcGVeLgvGN_JMHS2rezxqFaW9sADJXjWv001fcWNfV8.jar
    Sep 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-w1m__QuXuBNxtNkh-QvirI04GV0qo3g0Fyuz_CIjAIo.jar
    Sep 29, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 194 files cached, 26 files newly uploaded in 1 seconds
    Sep 29, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 29, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 29, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 29, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 29, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 29, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95915 bytes, hash cda34bb05d98b951918a1c71e2688661f852c5eb92ab1dbaada324599f39b36f> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-zaNLsF2YuVGRihxx4miGYfhSxeuSqx26raMkWZ85s28.pb
    Sep 29, 2020 6:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Sep 29, 2020 6:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-29_11_45_21-8437957726531856427?project=apache-beam-testing
    Sep 29, 2020 6:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-29_11_45_21-8437957726531856427
    Sep 29, 2020 6:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-29_11_45_21-8437957726531856427
    Sep 29, 2020 6:45:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-29T18:45:21.344Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 29, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T18:45:28.467Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 29, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T18:45:29.156Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 29, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T18:45:29.196Z: Expanding GroupByKey operations into optimizable parts.
    Sep 29, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T18:45:29.225Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 29, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T18:45:29.308Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 29, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T18:45:29.343Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 29, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T18:45:29.377Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 29, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T18:45:29.401Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 29, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T18:45:29.906Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 29, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T18:45:29.973Z: Starting 5 workers in us-central1-b...
    Sep 29, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T18:45:54.824Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
    Sep 29, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T18:45:54.861Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
    Sep 29, 2020 6:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T18:46:00.301Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 29, 2020 6:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T18:46:02.138Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 29, 2020 6:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T18:46:18.204Z: Workers have started successfully.
    Sep 29, 2020 6:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T18:46:18.237Z: Workers have started successfully.
    Sep 29, 2020 6:46:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T18:46:57.711Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 29, 2020 6:46:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T18:46:57.843Z: Cleaning up.
    Sep 29, 2020 6:46:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T18:46:57.909Z: Stopping worker pool...
    Sep 29, 2020 6:47:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T18:47:48.477Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 29, 2020 6:47:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T18:47:48.515Z: Worker pool stopped.
    Sep 29, 2020 6:47:53 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-29_11_45_21-8437957726531856427 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): c3bee6c0-1ca6-4377-9520-e00d88bb38fe and timestamp: 2020-09-29T18:47:53.732000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    20.874

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 29, 2020 6:47:54 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker,5,main]) completed. Took 2 mins 45.778 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 37s
107 actionable tasks: 62 executed, 45 from cache

Publishing build scan...
https://gradle.com/s/2wydiepvvmius

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1054

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1054/display/redirect>

Changes:


------------------------------------------
[...truncated 268.81 KB...]
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_3/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 29, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 29, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 29, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 29, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 29, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 29, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 29, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 29, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 29, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 29, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 29, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 29, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 29, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 29, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 29, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 29, 2020 12:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 29, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 29, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-gSfVaW0uPKTdpAiv8pU8usJukjI-9DRm4pYNRsp1pvA.jar
    Sep 29, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-XumsOKQ1iKrwCkKKPrLUGW1L1NDhzfuOwwTXp7RW5QU.jar
    Sep 29, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-gSfVaW0uPKTdpAiv8pU8usJukjI-9DRm4pYNRsp1pvA.jar
    Sep 29, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7113768243728609945.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-jUAeoP6DcJlRxqpqajXyYmLyWRDQFvsbZaBd0u3gV60.jar
    Sep 29, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-vAXAwz3m_1I6B8wc4uTG_j0n8ZoLXC41QW9d4UPtqEM.jar
    Sep 29, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-w5qpeXjdsrFueQgcGvI1xHIj5xgEe0r_eMP2JYm9_IM.jar
    Sep 29, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-t_frMk49fHKFdpZZA3RXPwklx1M_BZgSydjUUfmS9cs.jar
    Sep 29, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-cwrTo44YokN6qHhwYLpsq987rYULTGQpvN6R2sy9-N8.jar
    Sep 29, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-S2h9G-Yjiqd4nVYTNGRWvFBXiQ3cwRFrclPan1qDjOw.jar
    Sep 29, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-L5L_2b5boa68rhBFWkheb3IbJsKzv-WiycLyXfSFGuY.jar
    Sep 29, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-zBTJEmwi--H4O2OJm_BeT_VxK_NbMAlsh34cPR7nE30.jar
    Sep 29, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-3fCvDoowKklTFZAbqc0N_bS2WIUJ5tb8IHurMJMR1H8.jar
    Sep 29, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-meTkSMOzYwwz-W7zjn1o4c7g3Cz_N0fy-qxdDPtxsBY.jar
    Sep 29, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-BtS2mh3OJlikPYOSB58lIDxlNIoVMuLe5icq8_OWdcM.jar
    Sep 29, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-NrTCTJ63l0BrlrdSlEVOBRQ4PqBpY7QLDv_WfL7_0NA.jar
    Sep 29, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-zM1DuKqRCjEu4wcvZodUEnWkkfuDLqx31L9dfhOdtNs.jar
    Sep 29, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-3XItXmIs6XoUcZ5mOUb3_JWV2h-AtjzNadRoV5GWCRU.jar
    Sep 29, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-Lsr-ialuUw1pXogNLc0U187Klyf6dFFKUjjFaiOgrWw.jar
    Sep 29, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-M0Oo2APLsE0O7WE91_4uZydk2rot8kxHGas9qnNeJ6Y.jar
    Sep 29, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-UWXmskVUU6rRQHretEvvbPC5CziDSsQMHsO7fGZXQWc.jar
    Sep 29, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-Fl0HrfI1huah3yWcZznc9tuK3Bpu69fCuD7986EOJFU.jar
    Sep 29, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-zl6J62pe88KSB0kTGjEQA71KOToavRA4397tiPpCK90.jar
    Sep 29, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-Ro2Oyapss4EoR6e0kjVFi-WDfsDrcm9JDjpExsAxgOg.jar
    Sep 29, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-IhSbZ5fO0K43TgFuJjy_gh2WigqrX1im22VcDIhsce8.jar
    Sep 29, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-SxGrUaoBHFThaxRCUEh4SoGc4i6XrDC1TzV9tjFG3zM.jar
    Sep 29, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-iPc0b8Zsff2rN8tNs_p9kOxHiVFf-0FN8F6pSZVn4LM.jar
    Sep 29, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-woVqDJNoBq7SRm9lNN_KJeamXL-PoG2iDaMTjKUDSic.jar
    Sep 29, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 194 files cached, 26 files newly uploaded in 0 seconds
    Sep 29, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 29, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 29, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 29, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 29, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 29, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95916 bytes, hash 546816d0bb16fc952c44ccf73672e1042b655aee1de86fb98f321d8920928b03> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-VGgW0LsW_JUsRMz3NnLhBCtlWu4d6G-5jzIdiSCSiwM.pb
    Sep 29, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Sep 29, 2020 12:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-29_05_45_22-2927391320876390192?project=apache-beam-testing
    Sep 29, 2020 12:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-29_05_45_22-2927391320876390192
    Sep 29, 2020 12:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-29_05_45_22-2927391320876390192
    Sep 29, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-29T12:45:22.298Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 29, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T12:45:29.714Z: Worker configuration: n1-standard-1 in us-central1-f.
    Sep 29, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T12:45:30.595Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 29, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T12:45:30.638Z: Expanding GroupByKey operations into optimizable parts.
    Sep 29, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T12:45:30.737Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 29, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T12:45:30.860Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 29, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T12:45:30.893Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 29, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T12:45:30.941Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 29, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T12:45:30.977Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 29, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T12:45:31.456Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 29, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T12:45:31.533Z: Starting 5 workers in us-central1-f...
    Sep 29, 2020 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T12:45:56.992Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 29, 2020 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T12:45:57.865Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 29, 2020 12:46:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T12:46:17.464Z: Workers have started successfully.
    Sep 29, 2020 12:46:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T12:46:17.501Z: Workers have started successfully.
    Sep 29, 2020 12:46:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T12:46:50.837Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 29, 2020 12:46:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T12:46:51.018Z: Cleaning up.
    Sep 29, 2020 12:46:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T12:46:51.105Z: Stopping worker pool...
    Sep 29, 2020 12:47:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T12:47:37.021Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 29, 2020 12:47:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T12:47:37.076Z: Worker pool stopped.
    Sep 29, 2020 12:47:43 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-29_05_45_22-2927391320876390192 finished with status DONE.
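
The submit-monitor-DONE sequence above is the standard DataflowRunner lifecycle. As a minimal sketch (the option values are copied from the log, but the class name and placeholder transform are illustrative, not the IT's code), launching and awaiting such a job looks roughly like this:

    import org.apache.beam.runners.dataflow.DataflowRunner;
    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.PipelineResult;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.transforms.Create;

    public class DataflowRunSketch {
      public static void main(String[] args) {
        // Running this for real needs GCP credentials and starts a billable Dataflow job.
        DataflowPipelineOptions options =
            PipelineOptionsFactory.fromArgs(args).as(DataflowPipelineOptions.class);
        options.setRunner(DataflowRunner.class);
        options.setProject("apache-beam-testing");   // project seen in the log
        options.setRegion("us-central1");            // region seen in the log
        options.setTempLocation("gs://temp-storage-for-perf-tests/loadtests/staging/");

        Pipeline p = Pipeline.create(options);
        // Placeholder transform so the sketch submits a valid job; the IT instead
        // applies its SQL read and monitoring ParDos here.
        p.apply(Create.of("placeholder"));

        PipelineResult result = p.run();   // prints the monitoring URL and "Submitted job: <id>"
        result.waitUntilFinish();          // blocks until a terminal state (DONE above)
      }
    }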

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 0b148667-cd7a-4d60-8394-344020d3fe81 and timestamp: 2020-09-29T12:47:43.779000000Z:
                     Metric:                    Value:
                   read_time                    13.228
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 29, 2020 12:47:44 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 2 mins 35.529 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 27s
107 actionable tasks: 62 executed, 45 from cache

Publishing build scan...
https://gradle.com/s/frfsy4o66wkzu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1053

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1053/display/redirect>

Changes:


------------------------------------------
[...truncated 271.55 KB...]
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_3/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)
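
This failure is the generic Beam coder error for a PCollection<Row>: the output of ParDo(RowMonitor) carries no schema, so no RowCoder can be inferred. A minimal sketch of the fix the message asks for, with an illustrative schema and a pass-through DoFn standing in for RowMonitor (this is not BigQueryIOPushDownIT's actual code):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Illustrative schema mirroring the fields selected by the test query.
        Schema schema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();

        PCollection<Row> source =
            p.apply(
                Create.of(Row.withSchema(schema).addValues("a", "story", "title", 3L).build())
                    .withCoder(RowCoder.of(schema)));

        // Pass-through ParDo standing in for ParDo(RowMonitor); its output is a
        // PCollection<Row> that has no coder until a schema is attached.
        source
            .apply(
                "RowMonitorLike",
                ParDo.of(
                    new DoFn<Row, Row>() {
                      @ProcessElement
                      public void process(@Element Row row, OutputReceiver<Row> out) {
                        out.output(row);
                      }
                    }))
            // The call the error message asks for; without it pipeline construction
            // fails exactly as logged above.
            .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }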

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 29, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 29, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 29, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 29, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 29, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 29, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 29, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 29, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 29, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 29, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 29, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 29, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 29, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 29, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 29, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
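
For reference, the pushed-down filter above comes from the test's SQL shown earlier in this log. A rough sketch of the same query expressed with SqlTransform over an existing schema-aware PCollection<Row> (an assumption; the IT actually reads the table through the Beam SQL BigQuery table provider with DIRECT_READ):

    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    /** Illustrative helper, not part of the IT. */
    class HackerNewsQuerySketch {
      // PCOLLECTION is Beam SQL's implicit table name for the transform's single input.
      static PCollection<Row> filterStoriesAndJobs(PCollection<Row> items) {
        return items.apply(
            SqlTransform.query(
                "SELECT `by` AS author, type, title, score "
                    + "FROM PCOLLECTION "
                    + "WHERE (type = 'story' OR type = 'job') AND score > 2"));
      }
    }
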
    Sep 29, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 29, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 29, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-LwpkD_TEfl6ip8ABur9nemM1xuVKbkLEJ4uupnipl7A.jar
    Sep 29, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-yHHNbH9WRbBUTqY2tHzZPy-3k0RmM8vS5Zk-o8-KLDw.jar
    Sep 29, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-lLEklKVhQILrE_8Pof6sNgxYVw_2O77CYpgp2uC5MrE.jar
    Sep 29, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test1987679953031993710.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-o2rCEXOrEc-qVK5k3IjBO6s7neUa2_pGbb-W0q8U9vU.jar
    Sep 29, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-keoPftgi8Ob1SHABMU5kd0E4YSMYeSDZWnl7K4CeVuM.jar
    Sep 29, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-Kt2GbMYN5cysbk84W8vU8MmmmrRPshSA72EOoao3Bpg.jar
    Sep 29, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-wJD-oiS_y4Ue5GTNKWb4endz5Wj6scruUk8syFQ-yCI.jar
    Sep 29, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-jR_ACGbzRXCAJoyYI9L5kFvokmILQ6BtREbTqKxvmjs.jar
    Sep 29, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-ify5JnA0PGnL72g4tFsu4E40QJjKS5W9xEdS7tQ99ls.jar
    Sep 29, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-I63IGrWs2dF3HhNMASUswU_-_z7PSZbl6AvUbCjOH2Y.jar
    Sep 29, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-mbFlByXtlNpnpYTFJ6snJkSipY7J-2dN-tfN1Bm1XsQ.jar
    Sep 29, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-whtOcV1BsL99mTayMnd_m31oPDE5AEHgYycrRBcrYtE.jar
    Sep 29, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-Cg70QzmmI7Q0Lp6T1HGmlOtIwLPkmYrWVJApEJRpOsM.jar
    Sep 29, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-d9OWel8tQ2u5H0GJOd-C7F9xjr1N1RTWkYqeqRC6Dlc.jar
    Sep 29, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-nQLlWGsLe0JxUzyZpol5N_JBVhLfVM5BpSQ_F8UhwSA.jar
    Sep 29, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-CdawxFGUYm7vKUbXYqKR_WZFF-n8a_U39XDSWATl8U4.jar
    Sep 29, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-LwpkD_TEfl6ip8ABur9nemM1xuVKbkLEJ4uupnipl7A.jar
    Sep 29, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-7EtcMgsVwna3S083GlC2qRnoYxVa9GGJnCpGVnh8tjw.jar
    Sep 29, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-QQbs2XX544yCZ5jO_is5kXdX5xb0Bk_0BmccsE5mUOk.jar
    Sep 29, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-UltHgTB0AnmVme3OTwVo7GccEBpYeK0JmOGotp6G2j4.jar
    Sep 29, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-HEpDvKfWTtXRba6VfMrIPG0rWd2spy8ieuQNotcCW-8.jar
    Sep 29, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-K-18gMb8RqFMOzer8rOHlnIa49lUv-ZQ_dhQ7NGy_4Q.jar
    Sep 29, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-v-pcACkitNXmFOnaj7YoQVMBO8l3eL5zJXl6GMwA65Y.jar
    Sep 29, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-ZdLWmuv5tNglwNb1rXU3-vRdx_edDjLEHZsMpEQL1us.jar
    Sep 29, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-VW33o_Im8QYBOHZWat5ZtbuVdWx3oBKNx5tAKR6F_p8.jar
    Sep 29, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-3GCmB5o7COCgbC1sMuqdO41unzg-8Ov9dYL8heKkKkk.jar
    Sep 29, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-Do7qUeK3LfZ5kh6q501AKqGRys0YYJsrRA_gh-8bWNs.jar
    Sep 29, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 194 files cached, 26 files newly uploaded in 0 seconds
    Sep 29, 2020 6:45:19 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 29, 2020 6:45:19 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 29, 2020 6:45:19 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 29, 2020 6:45:19 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 29, 2020 6:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 29, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95917 bytes, hash f4284a3b4b874c004d3aa40e6621e8fc24102288f1a40e234190c13566a85e96> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-9ChKO0uHTABNOqQOZiHo_CQQIojxpA4jQZDBNWaoXpY.pb
    Sep 29, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Sep 29, 2020 6:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-28_23_45_20-11414632088283701576?project=apache-beam-testing
    Sep 29, 2020 6:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-28_23_45_20-11414632088283701576
    Sep 29, 2020 6:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-28_23_45_20-11414632088283701576
    Sep 29, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-29T06:45:20.234Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 29, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T06:45:27.284Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 29, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T06:45:29.017Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 29, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T06:45:29.141Z: Expanding GroupByKey operations into optimizable parts.
    Sep 29, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T06:45:29.172Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 29, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T06:45:29.248Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 29, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T06:45:29.286Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 29, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T06:45:29.322Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 29, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T06:45:29.347Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 29, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T06:45:29.949Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 29, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T06:45:30.053Z: Starting 5 workers in us-central1-b...
    Sep 29, 2020 6:45:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T06:45:40.652Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 29, 2020 6:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T06:45:56.782Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 29, 2020 6:46:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T06:46:13.108Z: Workers have started successfully.
    Sep 29, 2020 6:46:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T06:46:13.146Z: Workers have started successfully.
    Sep 29, 2020 6:46:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T06:46:44.324Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 29, 2020 6:46:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T06:46:44.470Z: Cleaning up.
    Sep 29, 2020 6:46:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T06:46:44.538Z: Stopping worker pool...
    Sep 29, 2020 6:47:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T06:47:36.073Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 29, 2020 6:47:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T06:47:36.111Z: Worker pool stopped.
    Sep 29, 2020 6:47:42 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-28_23_45_20-11414632088283701576 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): c3f05d1e-f435-4bc9-b303-c153dcf6d827 and timestamp: 2020-09-29T06:47:42.389000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    14.624

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 29, 2020 6:47:42 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 8,5,main]) completed. Took 2 mins 35.218 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 25s
107 actionable tasks: 62 executed, 45 from cache

Publishing build scan...
https://gradle.com/s/kmkr2so27ntmk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1052

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1052/display/redirect?page=changes>

Changes:

[Robert Bradshaw] [BEAM-10978] Fix bug with map type inference.

[noreply] [BEAM-10882]  Update Snowflake docs (#12823)


------------------------------------------
[...truncated 270.41 KB...]
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_3/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 29, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 29, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 29, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 29, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 29, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 29, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 29, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 29, 2020 12:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 29, 2020 12:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 29, 2020 12:45:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 29, 2020 12:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 29, 2020 12:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 29, 2020 12:45:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 29, 2020 12:45:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 29, 2020 12:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 29, 2020 12:45:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 29, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 29, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-sZf0FsJQdGsa1iI9LYD83s32hZYZfG3VTpAZoTHkO3c.jar
    Sep 29, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-70Psw04o0PeDVMYRp9-2nWhBMGwhn0GFIWWgVDBpWeQ.jar
    Sep 29, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-2d6RrgF3AMWSWApUdyd04ju-uorJy8tRbLFUjDN8OXg.jar
    Sep 29, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-TAkjyoZJE_e1Hw8QkGM5u-98yOvbsKMrTnfEY1EWUIo.jar
    Sep 29, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-9g8AGXyPesWBY2RWGpZd_rPHswEjOA2XwmeY9r1oBOA.jar
    Sep 29, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-EUwi8SK1wbX1y1-r8r0L5vVxYewxELI7XsKRJq2xg7Y.jar
    Sep 29, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-9Lde-q-lgGXqYVZ5VebvqkoiwkAZ9rzLI_sb-wd4QpY.jar
    Sep 29, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-nq4ifGCLejmAmDT03shQ_4Vt1GblRZ0Fhb-r0xt-tqA.jar
    Sep 29, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-Cywi0Pl1nZG4SXV8lf9AFW0LLq4lVpCUS0i_uYmBTBI.jar
    Sep 29, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-MM3rHTVCW1jLKQFdBK1Ag6RGczBdNxVJC-tCayFH6IM.jar
    Sep 29, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-Ka-peIj93YT-Qj7csbQy_FBJNfcEj-ONFepx9qwzzX0.jar
    Sep 29, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-3f1XD0i9kVRuBPgYHAtM_UCfhuqwYHIMb5Cktw2baTU.jar
    Sep 29, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-qAHRfbrTELZl6lLVkcb3zKxMidZh11F1lLH93chyjOw.jar
    Sep 29, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-pCv53ldZfgkb37w9Gpp2CU3OaES0-huO9E97kS1PzHU.jar
    Sep 29, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-klh3rMtvqKmAFMlHaK89G8RUe6pI_--e5_UYCrnv6MM.jar
    Sep 29, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-LVBhpBFwupKsBDON12HESob04QGDaGXmI4tAkHhHOl0.jar
    Sep 29, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-vGDi7mtL3lx6Vkb9GPsENRtuWI1_O8KVQ4qew63nsow.jar
    Sep 29, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tC-XiFbtW2c9eN2xTUu1ZMmET4oAMArMoS6KcYT3is4.jar
    Sep 29, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-qxrEf9-uitO7NJNe5vnWbiXHwRGkYI4R0dNj-0W4nOk.jar
    Sep 29, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-pGsMFkwKCrrFjyPRxJtjD4eDdMWjaMN0mZdfAzcZv0k.jar
    Sep 29, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-obVEL_pSZomw85YzQ3sgvXyFPO4voMRh5tp8kZoQfpo.jar
    Sep 29, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3239215713181194417.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-zOMRRZXuOVLbquHT0Wp5rEQLoVmOE_xyRKbKInJnMr0.jar
    Sep 29, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-JLGsEAF3W5eK3Tmxp6sbo16I7AtZeZZPYyoMqSWULac.jar
    Sep 29, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-sZf0FsJQdGsa1iI9LYD83s32hZYZfG3VTpAZoTHkO3c.jar
    Sep 29, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-lK6TgY5WMARqb8dxxBe-CW2fnp5LqAgu5t-VqZ3toFA.jar
    Sep 29, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-NTPf8SdGnAoEUwg_59vKjj-q2VPstb64hiQoBO2RzsE.jar
    Sep 29, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-6Hu1X5bHHa2v2SWKr56itgUwYpBvdoCrDTiWwtFqDcc.jar
    Sep 29, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 194 files cached, 26 files newly uploaded in 0 seconds
    Sep 29, 2020 12:45:20 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 29, 2020 12:45:20 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 29, 2020 12:45:20 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 29, 2020 12:45:20 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 29, 2020 12:45:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 29, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95916 bytes, hash 608bde57cd5690610644171f2375cae7cd994af41d195e27332d5844173b0eb7> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-YIveV81WkGEGRBcfI3XK582ZSvQdGV4nMy1YRBc7Drc.pb
    Sep 29, 2020 12:45:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Sep 29, 2020 12:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-28_17_45_20-13298363548966251720?project=apache-beam-testing
    Sep 29, 2020 12:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-28_17_45_20-13298363548966251720
    Sep 29, 2020 12:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-28_17_45_20-13298363548966251720
    Sep 29, 2020 12:45:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-29T00:45:20.789Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
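
The warning above reflects how the Dataflow worker-pool options interact: with autoscalingAlgorithm=NONE the service runs a fixed pool of numWorkers and ignores maxNumWorkers. A minimal sketch of that configuration, using the standard DataflowPipelineWorkerPoolOptions setters; the wrapper class below is hypothetical and not taken from the test code:

    import org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions;
    import org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions.AutoscalingAlgorithmType;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    class FixedWorkerPoolSketch {
      static DataflowPipelineWorkerPoolOptions fixedPoolOfFive() {
        DataflowPipelineWorkerPoolOptions options =
            PipelineOptionsFactory.create().as(DataflowPipelineWorkerPoolOptions.class);
        // Disabling autoscaling makes numWorkers the effective pool size ...
        options.setAutoscalingAlgorithm(AutoscalingAlgorithmType.NONE);
        options.setNumWorkers(5);
        // ... and maxNumWorkers is then ignored, which is what triggers the warning above.
        options.setMaxNumWorkers(5);
        return options;
      }
    }
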
    Sep 29, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T00:45:28.870Z: Worker configuration: n1-standard-1 in us-central1-c.
    Sep 29, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T00:45:29.595Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 29, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T00:45:29.667Z: Expanding GroupByKey operations into optimizable parts.
    Sep 29, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T00:45:29.710Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 29, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T00:45:29.790Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 29, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T00:45:29.854Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 29, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T00:45:29.898Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 29, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T00:45:29.960Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 29, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T00:45:30.370Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 29, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T00:45:30.466Z: Starting 5 workers in us-central1-c...
    Sep 29, 2020 12:45:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T00:45:46.978Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 29, 2020 12:45:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T00:45:54.462Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 29, 2020 12:46:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T00:46:14.603Z: Workers have started successfully.
    Sep 29, 2020 12:46:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T00:46:14.641Z: Workers have started successfully.
    Sep 29, 2020 12:46:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T00:46:48.914Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 29, 2020 12:46:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T00:46:49.085Z: Cleaning up.
    Sep 29, 2020 12:46:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T00:46:49.177Z: Stopping worker pool...
    Sep 29, 2020 12:47:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T00:47:40.648Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 29, 2020 12:47:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T00:47:40.706Z: Worker pool stopped.
    Sep 29, 2020 12:47:46 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-28_17_45_20-13298363548966251720 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 729239e1-bd09-4e15-ad15-4880c1ad8f1a and timestamp: 2020-09-29T00:47:46.172000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    14.705

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 29, 2020 12:47:46 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 2 mins 38.561 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 29s
107 actionable tasks: 62 executed, 45 from cache

Publishing build scan...
https://gradle.com/s/my42z35qtzvyq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1051

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1051/display/redirect?page=changes>

Changes:

[noreply] Update test_get_python_sdk_name to supported python version (#12950)


------------------------------------------
[...truncated 271.51 KB...]
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 28, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 28, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 28, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 28, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 28, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 28, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 28, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
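
As the IllegalStateException above indicates, the failing tests finalize a PCollection of Beam Rows that has neither a schema nor a coder. A minimal sketch of the fix the message itself suggests, assuming a hypothetical PCollection<Row> named rows and an illustrative four-field schema (the real field names and types come from the HACKER_NEWS table and may differ):

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    class RowSchemaSketch {
      // Hypothetical helper: attach a schema so the SDK can derive a coder for Row.
      static PCollection<Row> attachSchema(PCollection<Row> rows) {
        // Illustrative schema for the four projected columns of the query above.
        Schema schema =
            Schema.builder()
                .addNullableField("author", Schema.FieldType.STRING)
                .addNullableField("type", Schema.FieldType.STRING)
                .addNullableField("title", Schema.FieldType.STRING)
                .addNullableField("score", Schema.FieldType.INT64)
                .build();
        // Route named first in the error message: attach the row schema directly.
        rows.setRowSchema(schema);
        // Equivalent alternative: rows.setCoder(org.apache.beam.sdk.coders.RowCoder.of(schema));
        return rows;
      }
    }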

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 28, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 28, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 28, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 28, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 28, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 28, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 28, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 28, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 28, 2020 6:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 28, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 28, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-h3jjyUjFnRAN2R9XUn9Y8jfdZ5hu_K1BGJXc00h89mA.jar
    Sep 28, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-ucOLTYfKO64YPc8GZi1p3pNZis89otR9OBJS3G4G2GM.jar
    Sep 28, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-5FssLmcgdBkSsr4gNW9atsY5a1ZEcsLTZC4-tYwMdQY.jar
    Sep 28, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4559249131501615629.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-kxaRdDrOFdJVnv_BNTCcoxv5Lg26HebFfql4QfYjhiI.jar
    Sep 28, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-F65FPApNG6lt6Qw5X02wP8eUccuUtVQMPZq4ov02lCI.jar
    Sep 28, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-jRyNWOYfuH7fUi-yJp3kgXyFgw2rnEPM9rEcmAbz5Fw.jar
    Sep 28, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-u1w6A0MG0zX7CqeZr9IVBFtgFdX_Zm6Sa6I5PaCszuo.jar
    Sep 28, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-yAfV6aOfzdeZlBDWlmJbKH8LyevpPfUfud5W9V_2O4M.jar
    Sep 28, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-UkCmVgfiT4ctRJtzZaWhQz7xS8RoFYhq5weW-HvUsfI.jar
    Sep 28, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-io4_DjOhu9L9EQncIb3jbT7OIk_z26OiDSzWvk6m8Rw.jar
    Sep 28, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-Qf2V6ojWm6zzPBcPg5WZE2QPMANnjM1BB0ENuhx_pu8.jar
    Sep 28, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-OfRaDGPr0vybbYTaVlqv3-eS9cGIwj1yKgRy89nKIZo.jar
    Sep 28, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-RCMnZOdVB9SFdJZ6PBylAstJ32cvJBuIRSpzFXNYhAE.jar
    Sep 28, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-ipWdHXqYznsChNpHM6RrHW-_z4vwIf4O28RAeAWEQic.jar
    Sep 28, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-HKdipn-ott8ce9DWHzwMpliyjmxier4A63ByFUXkuiE.jar
    Sep 28, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-l_79uoMT_nZ9pyi0hZO7NoxGYePua7PiivcMP73BVyM.jar
    Sep 28, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-0gh9olCmYnlfvluCSZE81tqnj7RIGtVmZeq12-mGMVQ.jar
    Sep 28, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-qadZA8zVeriQodbJf9ZKLLUEa3EDHCPnxyM5_AYJYOw.jar
    Sep 28, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-oiHtovHIUho2HBjAoePXdjiuxOVyRSEuueIjJ-RMc3E.jar
    Sep 28, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-hWKQyCmMQPP_BYCIN59vz-JfcREQRto8Pil9IXdFKuE.jar
    Sep 28, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-VCoWym0pIu2B8K1no7J5hkni7CxUFXEBw_MppxwadOw.jar
    Sep 28, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-odwr3JhhtQ9OQKZDuB_btw9sXznq2WvnvnAtrl3GgNw.jar
    Sep 28, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-f3CRY6LzfxkIj1QBdDJHdiKvaPI4aE66XeDwpZA4w0Y.jar
    Sep 28, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-YXdQS8iaPDoKu6Pbu7RtEZUabXdcPT-ZPtFXahc_dzo.jar
    Sep 28, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-oDM1nuirkreipaNml4WpLFGqP6TsD2JYOMaBYX1Q7ns.jar
    Sep 28, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-h3jjyUjFnRAN2R9XUn9Y8jfdZ5hu_K1BGJXc00h89mA.jar
    Sep 28, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-8BBNjHE9t0a69IWa2gERcLhTDpWR4kgBIvmvuWQ-deY.jar
    Sep 28, 2020 6:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 194 files cached, 26 files newly uploaded in 1 seconds
    Sep 28, 2020 6:45:23 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 28, 2020 6:45:23 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 28, 2020 6:45:23 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 28, 2020 6:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 28, 2020 6:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 28, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95916 bytes, hash 75e897fe9eee631182ec22ec4da96e78d91a466c249b631b16a1bfa73adda0bf> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-deiX_p7uYxGC7CLsTalueNkaRmwkm2MbFqG_pzrdoL8.pb
    Sep 28, 2020 6:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Sep 28, 2020 6:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-28_11_45_24-15917909757904601353?project=apache-beam-testing
    Sep 28, 2020 6:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-28_11_45_24-15917909757904601353
    Sep 28, 2020 6:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-28_11_45_24-15917909757904601353
    Sep 28, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-28T18:45:24.420Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 28, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T18:45:30.947Z: Worker configuration: n1-standard-1 in us-central1-c.
    Sep 28, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T18:45:31.755Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 28, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T18:45:31.798Z: Expanding GroupByKey operations into optimizable parts.
    Sep 28, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T18:45:31.822Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 28, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T18:45:31.889Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 28, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T18:45:31.917Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 28, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T18:45:31.950Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 28, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T18:45:31.980Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 28, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T18:45:32.436Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 28, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T18:45:32.498Z: Starting 5 workers in us-central1-c...
    Sep 28, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T18:45:42.141Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 28, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T18:45:55.833Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Sep 28, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T18:45:55.861Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Sep 28, 2020 6:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T18:46:01.130Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 28, 2020 6:46:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T18:46:21.321Z: Workers have started successfully.
    Sep 28, 2020 6:46:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T18:46:21.370Z: Workers have started successfully.
    Sep 28, 2020 6:46:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T18:46:54.295Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 28, 2020 6:46:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T18:46:54.444Z: Cleaning up.
    Sep 28, 2020 6:46:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T18:46:54.508Z: Stopping worker pool...
    Sep 28, 2020 6:47:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T18:47:45.709Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 28, 2020 6:47:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T18:47:45.752Z: Worker pool stopped.
    Sep 28, 2020 6:47:52 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-28_11_45_24-15917909757904601353 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 428d30a9-5a80-42ab-bb88-e7b1021591b1 and timestamp: 2020-09-28T18:47:52.646000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    12.694

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 28, 2020 6:47:53 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 7,5,main]) completed. Took 2 mins 44.23 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 35s
107 actionable tasks: 62 executed, 45 from cache

Publishing build scan...
https://gradle.com/s/2kxkylrxuobmm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1050

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1050/display/redirect?page=changes>

Changes:

[nosacky] Fix up-to-date checking mechanism on Tox Task.

[noreply] [BEAM-10481] Ensure registration of the accumulator occurs. (#12850)


------------------------------------------
[...truncated 279.78 KB...]
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 28, 2020 12:45:38 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 28, 2020 12:45:38 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 28, 2020 12:45:38 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 28, 2020 12:45:38 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 28, 2020 12:45:38 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 28, 2020 12:45:38 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 28, 2020 12:45:39 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 28, 2020 12:45:39 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 28, 2020 12:45:39 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 28, 2020 12:45:39 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 28, 2020 12:45:39 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 28, 2020 12:45:39 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 28, 2020 12:45:39 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 28, 2020 12:45:39 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 28, 2020 12:45:39 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 28, 2020 12:45:40 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 28, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 28, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-9uVLE6MNxJh5fq9lkd0JHnb2_ngB7NufXk3fzatLX14.jar
    Sep 28, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-dEFwGkkr1HKF3rmvwUjWAyrhxuvetft8fnElrI2p28U.jar
    Sep 28, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-HnI-IAKH7z8XJyYaJPLpznEhPZ3Yj2BVfh0g5BsXcl4.jar
    Sep 28, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-1gNkBYes3uXTam9-oHKTqsOU2ZgWJN_YgIUgUCKJlHY.jar
    Sep 28, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-MqJyUYJKWMMEPFij-vZfLhgy3wj3UoroMHeH87lvIaI.jar
    Sep 28, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-C3o3dR9T94TTO37TCqO0Ue-gT2dVhkrXgfKoyg5XwTA.jar
    Sep 28, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-e4LMjRBBpzNWxdx-p-Je8qmHtIY26buWId-gKgRr2HA.jar
    Sep 28, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-VgqH2C-zTYZ1Io76rZEDe8piQ4KbyoYx6teh3xxh_CY.jar
    Sep 28, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-nttlflT5-prteLwhDGu_odRwuI81J2FDKNgHX8Xn-m0.jar
    Sep 28, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-h4LublY0qLDsJrZ2XERkFOj1TGxGyKk6ddQeSvVRzTM.jar
    Sep 28, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-KBmjD_4P2BNg-4Efv1uWlvc-Guwo3eZoMYoZ8tKtEIU.jar
    Sep 28, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-PL7oE8EkImnsPzq07ftQLAgwz7FwuOckNOfdNzchIs4.jar
    Sep 28, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-b7G0-6PUd_NUcz7yjNDH552KLvO03srWcZSvHThq7pM.jar
    Sep 28, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-hL44GhQg6xhkF6s9l5axN2QWlcOHeLCRoKMyO3_W5r0.jar
    Sep 28, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-XzIs0uEZs8iItazf0SQ22cPG7gaAt5tLb5FT4rinwZc.jar
    Sep 28, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test1500760170008985425.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-SIcGP0cEjcvxrkFjVm8-iOsk7xf5BW--FnPcYZM1Kxk.jar
    Sep 28, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-aucEzhzdgjgKUZspIBmyE-pVDqpGePMwnmX90nyn0I0.jar
    Sep 28, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-pfRnmDokLNWhPa0Y6ViKnCJXQyESBdYzdNjJUQTwyuQ.jar
    Sep 28, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-J68IOU5a9KC-wXgajmF1jnYTq3JRbH7n86x-2wr7ZxY.jar
    Sep 28, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-W6PgZnDBeyqKTokPSjnN95v2v1MwtwFB0WygNqr3x-c.jar
    Sep 28, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-hiAqumh1xg9BgGNmGR3_WbDF4IeFx1M41fH0OEE8NnY.jar
    Sep 28, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-MqJyUYJKWMMEPFij-vZfLhgy3wj3UoroMHeH87lvIaI.jar
    Sep 28, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-07FuoKvRQI-9R_fvdE5Hq3xYDA21DWkZQyQvNvDpvRI.jar
    Sep 28, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-U8Yvyacjb5mN7dhO4rQBNCW-60R9_haaILrExzWcPX8.jar
    Sep 28, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-IF4j2HinKmX8GH6IhSs4s6dA9EFTyBK0aRg_pCIJE9A.jar
    Sep 28, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-Qwz0BV9tJpIeBdV1UiqP0fzP8pnHsW7tzUCKVDp4EsQ.jar
    Sep 28, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-2BLzCwrfKQe_ArjNTdvx6s9TGOVoaB4W1ZBECMcqrqw.jar
    Sep 28, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/com.alibaba/fastjson/1.2.68/9e3d29f05bcfab1c15a1357ebf2dd513c1d42f49/fastjson-1.2.68.jar to gs://temp-storage-for-perf-tests/loadtests/staging/fastjson-1.2.68-cGrbCezeeBQfDPJGWh6b307ug_n5g8_BYqWhckhy_rs.jar
    Sep 28, 2020 12:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 193 files cached, 27 files newly uploaded in 0 seconds
    Sep 28, 2020 12:45:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 28, 2020 12:45:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 28, 2020 12:45:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 28, 2020 12:45:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 28, 2020 12:45:43 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 28, 2020 12:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95916 bytes, hash 8286ed8db4b3add32fd3f90473807d16b82dfa103233dabc98624b04022f6ad0> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-gobtjbSzrdMv0_kEc4B9Frgt-hAyM9q8mGJLBAIvatA.pb
    Sep 28, 2020 12:45:43 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Sep 28, 2020 12:45:45 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-28_05_45_43-7419717706125343924?project=apache-beam-testing
    Sep 28, 2020 12:45:45 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-28_05_45_43-7419717706125343924
    Sep 28, 2020 12:45:45 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-28_05_45_43-7419717706125343924
    Sep 28, 2020 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-28T12:45:43.843Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 28, 2020 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T12:45:52.057Z: Worker configuration: n1-standard-1 in us-central1-f.
    Sep 28, 2020 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T12:45:52.663Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 28, 2020 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T12:45:52.701Z: Expanding GroupByKey operations into optimizable parts.
    Sep 28, 2020 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T12:45:52.730Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 28, 2020 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T12:45:52.788Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 28, 2020 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T12:45:52.816Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 28, 2020 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T12:45:52.838Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 28, 2020 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T12:45:52.859Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 28, 2020 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T12:45:53.338Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 28, 2020 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T12:45:53.407Z: Starting 5 workers in us-central1-f...
    Sep 28, 2020 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T12:46:17.609Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
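
The warning above points at the Cloud Monitoring metricDescriptors API for pruning old custom metric descriptors. A minimal, hypothetical cleanup sketch with the google-cloud-monitoring Java client follows; the project id, the custom.googleapis.com/ prefix filter, and the dry-run flag are assumptions of this sketch, and actually deleting a descriptor irreversibly drops its recorded time series.

    import com.google.api.MetricDescriptor;
    import com.google.cloud.monitoring.v3.MetricServiceClient;
    import com.google.monitoring.v3.ListMetricDescriptorsRequest;
    import com.google.monitoring.v3.ProjectName;

    public class MetricDescriptorCleanupSketch {
      public static void main(String[] args) throws Exception {
        String projectId = "apache-beam-testing"; // assumption: target project
        boolean delete = false;                   // dry run by default

        try (MetricServiceClient client = MetricServiceClient.create()) {
          ListMetricDescriptorsRequest request =
              ListMetricDescriptorsRequest.newBuilder()
                  .setName(ProjectName.of(projectId).toString())
                  .build();
          for (MetricDescriptor descriptor :
              client.listMetricDescriptors(request).iterateAll()) {
            // Only user-defined metrics count against the descriptor quota.
            if (descriptor.getType().startsWith("custom.googleapis.com/")) {
              System.out.println("custom metric descriptor: " + descriptor.getType());
              if (delete) {
                // Irreversible: removes the descriptor and its historical data.
                client.deleteMetricDescriptor(descriptor.getName());
              }
            }
          }
        }
      }
    }
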
    Sep 28, 2020 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T12:46:25.615Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 28, 2020 12:46:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T12:46:43.649Z: Workers have started successfully.
    Sep 28, 2020 12:46:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T12:46:43.687Z: Workers have started successfully.
    Sep 28, 2020 12:47:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T12:47:12.105Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 28, 2020 12:47:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T12:47:12.237Z: Cleaning up.
    Sep 28, 2020 12:47:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T12:47:12.311Z: Stopping worker pool...
    Sep 28, 2020 12:48:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T12:48:12.863Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 28, 2020 12:48:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T12:48:12.903Z: Worker pool stopped.
    Sep 28, 2020 12:48:17 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-28_05_45_43-7419717706125343924 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): f8567e64-1e2b-4dab-b4cd-b7cf06f0c9aa and timestamp: 2020-09-28T12:48:17.883000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    10.487

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 28, 2020 12:48:18 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 3 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.024 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 2 mins 47.737 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 1s
107 actionable tasks: 68 executed, 39 from cache

Publishing build scan...
https://gradle.com/s/cc6k575wbb4bk

Stopped 2 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1049

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1049/display/redirect>

Changes:


------------------------------------------
[...truncated 271.71 KB...]
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 28, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 28, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 28, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 28, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 28, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 28, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 28, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
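
The readUsingDefaultMethod failure above is Beam's standard error for a PCollection of Row with no schema attached, so no RowCoder can be inferred; the message itself names PCollection.setRowSchema as the remedy. The sketch below is a minimal, self-contained illustration of that call, not the integration test's actual code: the input elements, field names, and values are made-up assumptions that only mirror the projected columns of the query in the log.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.MapElements;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;
    import org.apache.beam.sdk.values.TypeDescriptor;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Schema matching the projected columns of the query above (illustrative).
        Schema schema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();

        PCollection<Row> rows =
            p.apply(Create.of("story", "job"))
                .apply(
                    MapElements.into(TypeDescriptor.of(Row.class))
                        .via((String t) ->
                            Row.withSchema(schema).addValues("someone", t, "a title", 3L).build()))
                // Without this call, coder inference fails with the
                // IllegalStateException shown above.
                .setRowSchema(schema);

        // With the schema attached, Beam SQL can query the collection directly
        // (PCOLLECTION is the implicit name of a single SqlTransform input).
        rows.apply(
            SqlTransform.query(
                "SELECT author, type, title, score FROM PCOLLECTION WHERE score > 2"));

        p.run().waitUntilFinish();
      }
    }

Attaching the schema once at pipeline construction time is enough; every downstream transform then sees the Row coder automatically.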

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 28, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 28, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 28, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 28, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 28, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 28, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 28, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 28, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
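
The two plan entries above show a project push-down (usedFields) and a predicate push-down (BigQueryFilter) being handed to the BigQuery Storage API. A hand-written read with BigQueryIO's DIRECT_READ method does roughly the same thing; the sketch below is hypothetical, and the table spec and restriction string are illustrative assumptions rather than the test's configuration.

    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class DirectReadSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        p.apply(
            BigQueryIO.readTableRows()
                // Hypothetical table spec standing in for the HACKER_NEWS table.
                .from("bigquery-public-data:hacker_news.full")
                .withMethod(Method.DIRECT_READ)
                // Project push-down: only the used fields are requested.
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                // Predicate push-down: the supported filter is evaluated server-side.
                .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }
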
    Sep 28, 2020 6:45:14 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 28, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 28, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-_bxHSd15fEEL940JIkB0R_PG9UVduLNAI7rUSi7jON8.jar
    Sep 28, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test6916825691573026148.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-ATK3QZner8TmMVhGeaTawCG7hoP8QnPfXDwcikqhxz8.jar
    Sep 28, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-E8Iw-me3MKPVz5odWLPBhtnwYVvIkc7wYmlcCnV-XUY.jar
    Sep 28, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-ZYddUcNzp7kMNGIHJrNFGyE-AYqaupuKJPVSfxCkw28.jar
    Sep 28, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-GMrgPq07l7EWGc4vDEtNnPOcLd1c0-dIAwd693Ablo8.jar
    Sep 28, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-Q8T3Ywk4dXvMTZbOII9GPs0bom0gF76dyBmlelTUjOM.jar
    Sep 28, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-afYg-VF8-rSOPiKn4r17Yicfm88qiqH87hypmLLdVV0.jar
    Sep 28, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-1daSSqzz_OVUbAr3RVIrxi48wUu1BC760_U6NYrIelg.jar
    Sep 28, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-42DhFcnXXs9VQJQuTxDgl7W8uhUp2Z1KTzSZuRoYSuQ.jar
    Sep 28, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-fxQhQ6wnqk2zJfihNsvPgA095qdq1n1accJi8LAczdI.jar
    Sep 28, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-NxPiyjzSppZF1bWYHQzdzAh4BOM5gFz0BkXnItINZu8.jar
    Sep 28, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-vkKj3kXvYG8Xz5ZEkZUoorAv-A2tQmmnNaHocTrmRlU.jar
    Sep 28, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-yXcEzd9vwxTDgE1lJm8sQDpqZjolHqKkJ1lieZt8fAw.jar
    Sep 28, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-4bspIpGBPbM7dl05FQS8QpSQkjWo9OgRR61zlJ082rQ.jar
    Sep 28, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-BXpIL-exVxx1QAUWbrCWOOSlZ2gc_oAeOisY3Dh57gc.jar
    Sep 28, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-vkKj3kXvYG8Xz5ZEkZUoorAv-A2tQmmnNaHocTrmRlU.jar
    Sep 28, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-9WWeGTxRGQ7c04oGEPG6yV3uC7WMZrQ0yKeKHqGeZYM.jar
    Sep 28, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-W462tH2VYXk8XnnrqLZuvy2UyNeGv4YTMIZFcnCBb1k.jar
    Sep 28, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-aDqRfbgnzXueFsHvua3rpXM_6rgoDdl4sytt4jZl-_A.jar
    Sep 28, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-AW525wwAo4CE5gcToM6Ant7_lT1D3n8FsJXQRapBGe8.jar
    Sep 28, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-V7NBd4LZpEYWTe7YlMBnL8D85M2FNedKYXU7VIR0oUs.jar
    Sep 28, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-ICWZTiQGqjuQwR0ZG72Pas8gEYE_GcaJsFjyVehUuRw.jar
    Sep 28, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-wS5giF61VeQN3IaBgx5_NS5gskenhFcx5cq9gUY_fA4.jar
    Sep 28, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-c316MZo_s2pXVIk0MTgiQPO3Jpi8It-BTigZ1BUkQFA.jar
    Sep 28, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-1te8WqapAVWkzKTbGW4keQjMpZ15MRH2VWSx7axPpLE.jar
    Sep 28, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-m1Cv6DrWOWx56nOwpi_V0Q4IbUaDW217Ur68suNTgH0.jar
    Sep 28, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-wcgQsvO8z2OyhmRlbviOh0THlX3IZD5EErja_7IJDog.jar
    Sep 28, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 194 files cached, 26 files newly uploaded in 0 seconds
    Sep 28, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 28, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 28, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 28, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 28, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 28, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95914 bytes, hash 1cbf7c04fa10e297fd286e5ca42a3318e18f96f18514cfdb7ef2f670c752d706> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-HL98BPoQ4pf9KG5cpCozGOGPlvGFFM_bfvL2cMdS1wY.pb
    Sep 28, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Sep 28, 2020 6:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-27_23_45_18-17552227632398297292?project=apache-beam-testing
    Sep 28, 2020 6:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-27_23_45_18-17552227632398297292
    Sep 28, 2020 6:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-27_23_45_18-17552227632398297292
    Sep 28, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-28T06:45:18.525Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 28, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T06:45:26.893Z: Worker configuration: n1-standard-1 in us-central1-f.
    Sep 28, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T06:45:27.685Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 28, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T06:45:27.733Z: Expanding GroupByKey operations into optimizable parts.
    Sep 28, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T06:45:27.770Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 28, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T06:45:27.850Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 28, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T06:45:27.877Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 28, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T06:45:27.912Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 28, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T06:45:27.939Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 28, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T06:45:28.299Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 28, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T06:45:28.388Z: Starting 5 workers in us-central1-f...
    Sep 28, 2020 6:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T06:45:39.717Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 28, 2020 6:45:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T06:45:57.653Z: Autoscaling: Raised the number of workers to 3 based on the rate of progress in the currently running stage(s).
    Sep 28, 2020 6:45:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T06:45:57.698Z: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
    Sep 28, 2020 6:46:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T06:46:03.011Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Sep 28, 2020 6:46:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T06:46:03.042Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Sep 28, 2020 6:46:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T06:46:21.921Z: Workers have started successfully.
    Sep 28, 2020 6:46:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T06:46:21.948Z: Workers have started successfully.
    Sep 28, 2020 6:46:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T06:46:24.353Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 28, 2020 6:46:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T06:46:58.263Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 28, 2020 6:46:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T06:46:58.476Z: Cleaning up.
    Sep 28, 2020 6:46:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T06:46:58.541Z: Stopping worker pool...
    Sep 28, 2020 6:47:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T06:47:51.628Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 28, 2020 6:47:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T06:47:51.681Z: Worker pool stopped.
    Sep 28, 2020 6:47:57 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-27_23_45_18-17552227632398297292 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 642fe60c-3310-43ce-a1b0-c85d54894ce7 and timestamp: 2020-09-28T06:47:57.102000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    17.116

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 28, 2020 6:47:57 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 2 mins 51.511 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 40s
107 actionable tasks: 62 executed, 45 from cache

Publishing build scan...
https://gradle.com/s/qtwauf3rgqpas

Build cache (/home/jenkins/.gradle/caches/build-cache-1) removing files not accessed on or after Mon Sep 21 06:44:21 UTC 2020.
Build cache (/home/jenkins/.gradle/caches/build-cache-1) cleaned up in 0.192 secs.
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1048

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1048/display/redirect>

Changes:


------------------------------------------
[...truncated 270.95 KB...]
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 28, 2020 12:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 28, 2020 12:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 28, 2020 12:45:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 28, 2020 12:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 28, 2020 12:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 28, 2020 12:45:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 28, 2020 12:45:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 28, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 28, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 28, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 28, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 28, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 28, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 28, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 28, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 28, 2020 12:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 28, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 28, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-X9zsSq6L4jaSwzp0_JQgIXVLeA5Ao4ArV1NeSNF5OI0.jar
    Sep 28, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-4CNd0_Z09F-ua4Zu2pWdHiOenOilMIlJ-SKMvfI1ehw.jar
    Sep 28, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-ULFt5jU_LtBajqcBCuhtDch9jB6x-rAdg8Xpti8Kyvg.jar
    Sep 28, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-ugD73aS8u0jI-PVciT76TIyrQrBKaUpwPuD1uHPDd_0.jar
    Sep 28, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-HZ6O6QJEk-boNyijs_TF0XhY4xH6iE2sj7HeGQ7K_n0.jar
    Sep 28, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-UKLr2BjT56D8xAWOHpz2v7PAi4NLqloetiCmlBkuHl4.jar
    Sep 28, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2553409351212571711.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-yIn0WmE8ssNPVN4FK6iBhNIrKtO7Nluhj96O_80u45M.jar
    Sep 28, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-YRDAA3dBEYu85-vei-Eo0qocLKAXoS3dhTTwp3hBhOI.jar
    Sep 28, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-hn9V-U3blw1XcmtJI98A8pAblD_av4K0hJelGJyrOKQ.jar
    Sep 28, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-qj3Pt3Bv1jIbNrOS7YxnM8JvQoCqlzESsbfG8b43aPQ.jar
    Sep 28, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-X9zsSq6L4jaSwzp0_JQgIXVLeA5Ao4ArV1NeSNF5OI0.jar
    Sep 28, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-EblEc17xHi4-GQEM5oHWEiwVRB04_HIFl0h0-XcMrzw.jar
    Sep 28, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-xoDmHC5v2tBB2kH5UuBkAoU_VWdYxlH7ZjJfzGdUO08.jar
    Sep 28, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-txXeUrLz8-HNqDLFn8X5vUqJQnB9Sdkm_XJeFEBK0xY.jar
    Sep 28, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-Ae60p5bby_GT_949U5NkcdqZbWBmWxrdLNIRwHXJoA0.jar
    Sep 28, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-lrICINdkI7WpAoLQAYBJJbQzvmHAnXsrnJe_i8o1ZNs.jar
    Sep 28, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-gnfQM3XgiD5ix0NRf4jtKnl0JFVOnobXnTcnUQ-ZfLk.jar
    Sep 28, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-Ng1oP3RST5hxRIn-PpnRb9Txwr1ralLbVA4w8EHVIxo.jar
    Sep 28, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-ZDsXkhReuXFcJ7RgCZ9wDp8y4HdNe1ah877yTUwqiYo.jar
    Sep 28, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-HBjkiVg_kTDaGaacZpT2yU01IW6Cf6b8VBlhOZpvDuM.jar
    Sep 28, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-sDw2681SBtyftX5d39mAChFT7m1C2tCgjGtpUPZJ8wk.jar
    Sep 28, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-ECUdTJFpH5peOQWopsrgu5jxZSZmtlG1_wb1yJCANbY.jar
    Sep 28, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-CrnDvnDjm_ttpvJr5vJt-6m35cxPQxbN4ef6gr0vfJg.jar
    Sep 28, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-klMWnN9IaoyrQXWISyms7NU3yWNW9Rs1R6V_Fz-4Ww4.jar
    Sep 28, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-V0a3O7jlnm8gnRU3m1dmNiF-669MOPYxfV_8ZLVGf5E.jar
    Sep 28, 2020 12:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-iOLTiVprjsSi1BHnMyIH28S79sqcW1P_pWCaY72miFY.jar
    Sep 28, 2020 12:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-CiOixFjGtXclGJHfC-CsqhSzn0WDah78Tl7mi0FGijg.jar
    Sep 28, 2020 12:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 194 files cached, 26 files newly uploaded in 1 seconds
    Sep 28, 2020 12:45:21 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 28, 2020 12:45:22 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 28, 2020 12:45:22 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 28, 2020 12:45:22 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 28, 2020 12:45:22 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 28, 2020 12:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95916 bytes, hash a62f5915a6189a98d8f92165b6f73762744ad91eb6738a7eaf5a27b7cbd4d3ad> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-pi9ZFaYYmpjY-SFltvc3YnRK2R62c4p-r1ont8vU060.pb
    Sep 28, 2020 12:45:22 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Sep 28, 2020 12:45:23 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-27_17_45_22-7149177217992534716?project=apache-beam-testing
    Sep 28, 2020 12:45:23 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-27_17_45_22-7149177217992534716
    Sep 28, 2020 12:45:23 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-27_17_45_22-7149177217992534716
    Sep 28, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-28T00:45:22.453Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
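
This warning is a worker-pool configuration detail: with autoscaling explicitly disabled, the fixed numWorkers value is what the service uses, so the requested maxNumWorkers is reported as ignored. A sketch of Dataflow options that produce this combination follows; the values only mirror what the log shows and are assumptions, not the job's actual submission code.

    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class WorkerOptionsSketch {
      public static void main(String[] args) {
        // Fixed-size worker pool: autoscalingAlgorithm=NONE disables autoscaling,
        // so numWorkers is used as-is and maxNumWorkers triggers the WARNING above.
        DataflowPipelineOptions options =
            PipelineOptionsFactory.fromArgs(
                    "--runner=DataflowRunner",
                    "--project=apache-beam-testing",    // values mirror the log; assumptions
                    "--region=us-central1",
                    "--workerMachineType=n1-standard-1",
                    "--numWorkers=5",
                    "--maxNumWorkers=5",
                    "--autoscalingAlgorithm=NONE")
                .as(DataflowPipelineOptions.class);

        Pipeline pipeline = Pipeline.create(options);
        // ... apply transforms and run() as usual ...
      }
    }
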
    Sep 28, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T00:45:30.178Z: Worker configuration: n1-standard-1 in us-central1-f.
    Sep 28, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T00:45:30.907Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 28, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T00:45:30.970Z: Expanding GroupByKey operations into optimizable parts.
    Sep 28, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T00:45:31.007Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 28, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T00:45:31.081Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 28, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T00:45:31.111Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 28, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T00:45:31.137Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 28, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T00:45:31.158Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 28, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T00:45:31.607Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 28, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T00:45:31.665Z: Starting 5 workers in us-central1-f...
    Sep 28, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T00:45:36.609Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 28, 2020 12:45:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T00:45:57.593Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Sep 28, 2020 12:45:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T00:45:57.620Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Sep 28, 2020 12:46:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T00:46:02.892Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 28, 2020 12:46:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T00:46:15.788Z: Workers have started successfully.
    Sep 28, 2020 12:46:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T00:46:15.824Z: Workers have started successfully.
    Sep 28, 2020 12:46:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T00:46:49.659Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 28, 2020 12:46:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T00:46:49.805Z: Cleaning up.
    Sep 28, 2020 12:46:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T00:46:49.880Z: Stopping worker pool...
    Sep 28, 2020 12:47:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T00:47:32.233Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 28, 2020 12:47:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T00:47:32.274Z: Worker pool stopped.
    Sep 28, 2020 12:47:39 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-27_17_45_22-7149177217992534716 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): ce22a18e-1052-4b0a-9094-4bac16978518 and timestamp: 2020-09-28T00:47:39.307000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    14.721

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 28, 2020 12:47:39 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.036 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.057 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 8,5,main]) completed. Took 2 mins 31.226 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 22s
107 actionable tasks: 62 executed, 45 from cache

Publishing build scan...
https://gradle.com/s/7222l5qrrllzg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure




Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1047

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1047/display/redirect>

Changes:


------------------------------------------
[...truncated 271.07 KB...]
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_3/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)
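
The failure is the coder check described in the message: the RowMonitor ParDo outputs Beam Row values, and a Row PCollection needs an explicit schema or RowCoder. A minimal sketch of the suggested remedy follows, with an illustrative schema and a pass-through DoFn standing in for the IT's own transforms.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline pipeline = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Illustrative schema matching the columns the query projects.
        Schema schema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();

        PCollection<Row> input =
            pipeline.apply(
                Create.of(Row.withSchema(schema).addValues("someone", "story", "a title", 3L).build())
                    .withCoder(RowCoder.of(schema)));

        // A DoFn that emits Row gives the SDK nothing to infer a coder from, so attach the
        // schema (or an explicit RowCoder) to the ParDo output, as the error message suggests.
        PCollection<Row> monitored =
            input
                .apply(
                    "PassThrough",
                    ParDo.of(
                        new DoFn<Row, Row>() {
                          @ProcessElement
                          public void processElement(@Element Row row, OutputReceiver<Row> out) {
                            out.output(row);
                          }
                        }))
                .setRowSchema(schema); // equivalently: .setCoder(RowCoder.of(schema))

        pipeline.run().waitUntilFinish();
      }
    }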

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 27, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 27, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 27, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 27, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 27, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 27, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 27, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
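
readUsingDefaultMethod fails on the same coder check as the direct-read test. For comparison, when a query of this shape runs through SqlTransform over a schema-aware PCollection, the Row coder is derived from the attached schema. The self-contained sketch below uses an illustrative schema, in-memory rows, and the PCOLLECTION table name as stand-ins for the IT's BigQuery-backed table.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class SqlOverRowsSketch {
      public static void main(String[] args) {
        Pipeline pipeline = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        Schema schema =
            Schema.builder()
                .addStringField("by")
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();

        PCollection<Row> hackerNews =
            pipeline.apply(
                Create.of(
                        Row.withSchema(schema).addValues("someone", "story", "a story", 5L).build(),
                        Row.withSchema(schema).addValues("someone", "comment", "a comment", 1L).build())
                    .withCoder(RowCoder.of(schema)));

        // A single input PCollection is visible to the query as the PCOLLECTION table.
        PCollection<Row> filtered =
            hackerNews.apply(
                SqlTransform.query(
                    "SELECT `by` AS author, `type`, `title`, `score` "
                        + "FROM PCOLLECTION "
                        + "WHERE (`type` = 'story' OR `type` = 'job') AND `score` > 2"));

        pipeline.run().waitUntilFinish();
      }
    }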

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 27, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 27, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 27, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 27, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 27, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 27, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 27, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 27, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 27, 2020 6:45:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 27, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 27, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-yFhG7gtAU3n6z9C7-ZjQ0wayWESg37lQz3ZtuI2O1gE.jar
    Sep 27, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-Svhg5zgjmTT78gawSeEj1s-nLNTYxPy2_nq9iWgqTbI.jar
    Sep 27, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-1SlBceqzYlSmJl9z-rnCTaYPQaPcs0c37nHCp7OJTQA.jar
    Sep 27, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-0XmhMPGPY8gjJlcMscrSudxh-IiW7T1Iz4dDLD8bhUo.jar
    Sep 27, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-aUzrAbr4JKSjmNZ1LxRrpTrXdgqmhdPPy4Li50nHfs0.jar
    Sep 27, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-aD8FuranYXmBKfEUNbzfHRKMWUEBh4JhwtCM9a1aae4.jar
    Sep 27, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-xV12WbqeVEkum18AE1j3WOAmtSgF3BoBVpnQKu29rq0.jar
    Sep 27, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-if1eDKeABKpDyWdC5LTmTrWCxCnqG2zqTu1JqJh8a-Y.jar
    Sep 27, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-WH4kJQibmfbnLTgj74Cgx15uTc1lJQJ2fWuOn80sMKw.jar
    Sep 27, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-lc4znn2cfaX9kwxJcSpbStv6ePNrc5N1T-2eTbe9OAA.jar
    Sep 27, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-jTXV2CA1yrPfykNklABnIQojx-cFPZt92e4Uf1rm-6s.jar
    Sep 27, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-I2bp1G0J02AjGhMnujEICnM3fkuUS0i_AKnmTdD2knQ.jar
    Sep 27, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-kJc61cyaEQPkfs5uCYleEFxncH3pf2fru0ltJ0ccu2Y.jar
    Sep 27, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2107590513650399029.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-ijOcYzRamsK9q6he9LO_4rOr5ip-1OeSvRM4rrSGUiU.jar
    Sep 27, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-RDbLaVz2-5raeLho8ETwpa9LBYifQ_jGrq07vPXpgZk.jar
    Sep 27, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-D2iO_pNIUsqkmH18knkHfhOAM6Fgmd6bXHHri0qPqIQ.jar
    Sep 27, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-g9jxBtRFoKym9Lhab6pfFyPeZLSRRAO36Cp3kJjKH20.jar
    Sep 27, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-4_t9g62ZIkVcH-2gJwQ0I4Al-XlCyQAK2RMGjddBgWc.jar
    Sep 27, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-Gqj4LsEC6CqKwnrYRkBR1LADYlbFugBers_9dYqzQG8.jar
    Sep 27, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-4KgP9uyLOp6fnWUDVXaSIijwYUcpbUXBTWYxl80SeH0.jar
    Sep 27, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-VXdIKt12g0mIWwfcM1SpXN1WPBeWNatu_2QBv62Xrcw.jar
    Sep 27, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-1SlBceqzYlSmJl9z-rnCTaYPQaPcs0c37nHCp7OJTQA.jar
    Sep 27, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-jtMTWPMerP3pkLGeGIW_kpu_r1M-JByLxXTMsbMYgOI.jar
    Sep 27, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-kSLD2c0PnE7r1Jo5ssXEyjXZw2bYpkrXlqk3qGmxzYs.jar
    Sep 27, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-H8HDKfg3KRmN1qaQqJNnYIUyigFdarYOIscWTYLM7BE.jar
    Sep 27, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-Fhgw4OxKPImwKn6tAkLjYrkowMQYHY8sHE-_uRlSpR0.jar
    Sep 27, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-xz_NLQ10hEBttMY6eL-EVAC1cjm-yyjM45xIbG1KOFI.jar
    Sep 27, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 194 files cached, 26 files newly uploaded in 1 seconds
    Sep 27, 2020 6:45:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 27, 2020 6:45:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 27, 2020 6:45:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 27, 2020 6:45:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 27, 2020 6:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 27, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95914 bytes, hash b00424351d783dab1ac50c2c78b5145b560929f300593fb03f3f7f9bbbd8a509> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-sAQkNR14PasaxQwseLUUW1YJKfMAWT-wPz9_m7vYpQk.pb
    Sep 27, 2020 6:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Sep 27, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-27_11_45_19-3032873533213885387?project=apache-beam-testing
    Sep 27, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-27_11_45_19-3032873533213885387
    Sep 27, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-27_11_45_19-3032873533213885387
    Sep 27, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-27T18:45:19.639Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 27, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T18:45:28.159Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 27, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T18:45:29.299Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 27, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T18:45:29.340Z: Expanding GroupByKey operations into optimizable parts.
    Sep 27, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T18:45:29.368Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 27, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T18:45:29.424Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 27, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T18:45:29.464Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 27, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T18:45:29.500Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 27, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T18:45:29.523Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 27, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T18:45:29.996Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 27, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T18:45:30.074Z: Starting 5 workers in us-central1-b...
    Sep 27, 2020 6:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T18:45:51.453Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 27, 2020 6:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T18:45:57.584Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 27, 2020 6:46:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T18:46:16.148Z: Workers have started successfully.
    Sep 27, 2020 6:46:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T18:46:16.185Z: Workers have started successfully.
    Sep 27, 2020 6:46:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T18:46:47.391Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 27, 2020 6:46:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T18:46:47.537Z: Cleaning up.
    Sep 27, 2020 6:46:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T18:46:47.622Z: Stopping worker pool...
    Sep 27, 2020 6:47:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T18:47:37.298Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 27, 2020 6:47:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T18:47:37.348Z: Worker pool stopped.
    Sep 27, 2020 6:47:44 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-27_11_45_19-3032873533213885387 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): b1c49999-b53b-4675-91fd-23375b6e39b4 and timestamp: 2020-09-27T18:47:45.005000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    13.342

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 27, 2020 6:47:45 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.018 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 2 mins 38.518 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 28s
107 actionable tasks: 62 executed, 45 from cache

Publishing build scan...
https://gradle.com/s/k23arrkid3nsu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1046

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1046/display/redirect>

Changes:


------------------------------------------
[...truncated 270.77 KB...]
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_3/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 27, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 27, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 27, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 27, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 27, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 27, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 27, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 27, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 27, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 27, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 27, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 27, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 27, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 27, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 27, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
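
At the IO layer, the push-down reported above amounts to requesting only the used fields and sending the supported predicate as a row restriction on the BigQuery Storage read. A rough hand-written equivalent with BigQueryIO is sketched below; the table reference is an assumption for illustration, not the table used by this test.

    import com.google.api.services.bigquery.model.TableRow;
    import java.util.Arrays;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;

    class PushDownReadSketch {
      // Only the columns used by the query are requested, and the supported filter
      // is pushed to the Storage API as a row restriction.
      static TypedRead<TableRow> read() {
        return BigQueryIO.readTableRows()
            .from("some-project:some_dataset.HACKER_NEWS") // hypothetical table reference
            .withMethod(TypedRead.Method.DIRECT_READ)
            .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
            .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2");
      }
    }
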
    Sep 27, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 27, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 27, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-0Oa3kLLkZl-SNg_eUkmwErkQbd9fm7_vTbVYSF4CHeI.jar
    Sep 27, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-RoeBP9T4e7uC2HS_Iopc_zFjA-3Z5IQZR97YPOolW4E.jar
    Sep 27, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-WzAp1x4on3eBzxkGq4ewl9j6YbKGIgoROPJvsvqf4Do.jar
    Sep 27, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-EOl2uVLiPDB_sN6RNEjfR28LU99WjUgc0jTyCzVjsUg.jar
    Sep 27, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-sGCZIE70PyTHvvFpx_8Zc2q2ee2mpEahykRiYUMn6Bo.jar
    Sep 27, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-qpetJ6EzRdpzsZxmWlxQBtOt5LScxppRpF6LjUNc1NM.jar
    Sep 27, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-6QPEKnxwqnM5182JXL3hLI_pm1eGmp2x7QvHeQEP23c.jar
    Sep 27, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-jQum4yiy7uePfoGWyhkUHnQnToc0_ZXr6vs0UeURNtM.jar
    Sep 27, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-UJ0TVdVA-A80hKEYpIBRxuLdxHo_jFV0aYQGif6wTUw.jar
    Sep 27, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-sGCZIE70PyTHvvFpx_8Zc2q2ee2mpEahykRiYUMn6Bo.jar
    Sep 27, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7341994116803359113.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-cl0eD3J6Kb4R9p9aJ1-22OJNfna7s6bXaZdnivinNpE.jar
    Sep 27, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-IENsDlGbRsfRc2NTeo0var-iVFzE0e_BCDh9Pn2xnEg.jar
    Sep 27, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-XgZtCKw8auZKVmDlD_wSIEmA0YSspBLxRIr9me4cNBo.jar
    Sep 27, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-PQhEETNvb2f909PNEe_a53IxYDIaDsa2FJvDiKavxFQ.jar
    Sep 27, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-uyr13XO0gRPgK2-fXxyC60cMCtRhNbvGleA6dMEFu6Q.jar
    Sep 27, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-LFNSmgRBFrh9rYMMjx4I7ma0KIXUBAfMzX7NpboSOj8.jar
    Sep 27, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-qckjG2LATV0dBLNVhPOpuYNjEFwY3QAdy_xzaw2S1sY.jar
    Sep 27, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-aFTahuwuOyVZTdg8tm5gjtXV9DMQKPqsXaS9BMeFCCE.jar
    Sep 27, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-guAUKMpeUoP7DDvGYAhhTI69v7IHaw6SC8qwEV6ztBQ.jar
    Sep 27, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-29vp8A-cNpPthLx_YMK1pmKEbWnjIejYye4WTNJT6EE.jar
    Sep 27, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-qUf1JmSvHh_vKxKowYZRN5Nltx0x1b7U5XCbWUItpzg.jar
    Sep 27, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-zQEXJcawgHXkpbcgHRlGtowU5vT7-YdQBex-fcOQH8I.jar
    Sep 27, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-vvT-tt_qMEVB5rCAPBUvdcE8VS-Dp9YNe2gCWENaRgU.jar
    Sep 27, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-sUhDCwbr4VP4WUrRph98emvSLSBr0Z1HwXny-BJr9Wk.jar
    Sep 27, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-Jcto0iKeTfb-5mLYPiJko9AH53HPpNrywC16ElF1Q_c.jar
    Sep 27, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-1z_Q7yltJilAvS8iPOZ7LX-8X4n9xKVD9z4cUSEHitk.jar
    Sep 27, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-DEp5jWpkb0u9bOxOqgrkEf6eeTamcQ8lKf18PVU9qu4.jar
    Sep 27, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 194 files cached, 26 files newly uploaded in 1 seconds
    Sep 27, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 27, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 27, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 27, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 27, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 27, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95916 bytes, hash 16aacf788c582adae916a986248c95796f14e15822826df10471c0dd5002d6ed> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-FqrPeIxYKtrpFqmGJIyVeW8U4Vgigm3xBHHA3VAC1u0.pb
    Sep 27, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Sep 27, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-27_05_45_21-14364124214925366901?project=apache-beam-testing
    Sep 27, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-27_05_45_21-14364124214925366901
    Sep 27, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-27_05_45_21-14364124214925366901
    Sep 27, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-27T12:45:21.901Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 27, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T12:45:30.216Z: Worker configuration: n1-standard-1 in us-central1-f.
    Sep 27, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T12:45:31.291Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 27, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T12:45:31.328Z: Expanding GroupByKey operations into optimizable parts.
    Sep 27, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T12:45:31.366Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 27, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T12:45:31.478Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 27, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T12:45:31.515Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 27, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T12:45:31.545Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 27, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T12:45:31.583Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 27, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T12:45:31.933Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 27, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T12:45:32.008Z: Starting 5 workers in us-central1-f...
    Sep 27, 2020 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T12:45:41.161Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 27, 2020 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T12:45:57.444Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 27, 2020 12:46:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T12:46:14.850Z: Workers have started successfully.
    Sep 27, 2020 12:46:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T12:46:14.887Z: Workers have started successfully.
    Sep 27, 2020 12:46:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T12:46:50.647Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 27, 2020 12:46:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T12:46:50.777Z: Cleaning up.
    Sep 27, 2020 12:46:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T12:46:50.848Z: Stopping worker pool...
    Sep 27, 2020 12:47:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T12:47:40.892Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 27, 2020 12:47:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T12:47:40.931Z: Worker pool stopped.
    Sep 27, 2020 12:47:49 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-27_05_45_21-14364124214925366901 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 5bcc397a-ba91-4c80-9394-a501c3a0301f and timestamp: 2020-09-27T12:47:49.220000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    17.731

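The fields_read and read_time values above are Beam metrics collected from the PipelineResult after the job reaches DONE. A small sketch of how such a counter can be queried is shown below; the metric namespace used here is a guess, not necessarily the one the test registers.

    import org.apache.beam.sdk.PipelineResult;
    import org.apache.beam.sdk.metrics.MetricNameFilter;
    import org.apache.beam.sdk.metrics.MetricQueryResults;
    import org.apache.beam.sdk.metrics.MetricResult;
    import org.apache.beam.sdk.metrics.MetricsFilter;

    class MetricsQuerySketch {
      // Prints the attempted value of a counter such as "fields_read" once the job
      // has finished; "BigQueryIOPushDownIT" as the namespace is an assumption.
      static void printCounter(PipelineResult result, String name) {
        MetricQueryResults metrics =
            result.metrics()
                .queryMetrics(
                    MetricsFilter.builder()
                        .addNameFilter(MetricNameFilter.named("BigQueryIOPushDownIT", name))
                        .build());
        for (MetricResult<Long> counter : metrics.getCounters()) {
          System.out.println(counter.getName() + ": " + counter.getAttempted());
        }
      }
    }
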
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 27, 2020 12:47:49 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.036 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 2 mins 40.357 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 32s
107 actionable tasks: 62 executed, 45 from cache

Publishing build scan...
https://gradle.com/s/iz4gvbisr3gsi

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1045

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1045/display/redirect>

Changes:


------------------------------------------
[...truncated 272.47 KB...]
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 27, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 27, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 27, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 27, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 27, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 27, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 27, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 27, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 27, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 27, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 27, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 27, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 27, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 27, 2020 6:45:17 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 27, 2020 6:45:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 27, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 27, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 27, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT--KNaarnxrpPW3qQgdOXJXcxNDXXI3NyIFDBI62aF9Ro.jar
    Sep 27, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-u-wyZ5ecK-yuiJk42w0a4wDo0mi-ywAf3sUxGaGhQcc.jar
    Sep 27, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-Ph-0R-WxK9wvR7bGw1Wwy2iTVeDvtCXjHEpohbjOHK8.jar
    Sep 27, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-3Q9kwim4H_E01heNHLQZvgL4U0mEe4bLLmdxW6mrxLs.jar
    Sep 27, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-TsT6ExhdkMf7QPJ9fHPEElG0wbk-no02zeCf5MDvY98.jar
    Sep 27, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-UE4d2EuQTCVng551SgwMwP6-brDWQd-8LrFlE39l3dI.jar
    Sep 27, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-c8M_fquw6vLPj93HphKoVEguHl1eqTCYzEZ1JZLraIs.jar
    Sep 27, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests--tJK7RyO4iw1_cv5zomzdpZJPRFmLzWtnYKmwVRRFNM.jar
    Sep 27, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-QJQUvv5cnTNvQEfhPLtu3RRQ7KUqTU5AxwxkfPmOvVM.jar
    Sep 27, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-J6SlfskHH7kJTwRq2S8SHK0zHkf9QGei1GPBUwNrmvk.jar
    Sep 27, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-V5Va1LBFToIoM4OuVGIOOk9d7RqyihRbKsgKNWjgzXU.jar
    Sep 27, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-wezn-1eanifmxnQozwgyTb9g0C3QMoQCF05XwqftTgI.jar
    Sep 27, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-xJ-sTivd6JxWHjA_wIhJWyBZYaYqKTwMoj7aNR786Pk.jar
    Sep 27, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-rZRmU8hqyUCjLv-HyorH93ZbQMAnqon1G8C6Kve1FG4.jar
    Sep 27, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT--KNaarnxrpPW3qQgdOXJXcxNDXXI3NyIFDBI62aF9Ro.jar
    Sep 27, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-GiLX55nsEiJdlKa32qCVYH0pDHlr26z4Ay19vmWN8m0.jar
    Sep 27, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-VO2rmgR0gNrUC7U6lEdGrXxVlXkrllrG7JEXD0Xmrhk.jar
    Sep 27, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-EkARUIUC2bMj8q1neKC51Pd1Ax1XJpmLl9bt8CtohcA.jar
    Sep 27, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-taZbWgQUih13ihedyOnHlJTa0YZCCGkPJqK9poaDKUU.jar
    Sep 27, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test6755071685026074838.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-TdoqANYs9oSLd-TfFZ1CVDk46U0nV6iTX_sUj5Whrzg.jar
    Sep 27, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-SuAUip96Vm0vKcNgZsZEBDYJ2ZbFQuJTt-oPQHwStuY.jar
    Sep 27, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-4NuOT76H6uhzcA-EwU6lg9Y4dOI3fIS0tQfd0Ka7uNw.jar
    Sep 27, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-RTKMuyKLfjtvdxtmSIl343IhPfmGHSFDSPR0Sp1lJnU.jar
    Sep 27, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-6sUxkRzZb0D43oQ0XWIagCHyX2Bwygd75EGSB_Zq794.jar
    Sep 27, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-CMgTfe-wz-GmWGLhYjPoJu-UqB5UhcXTG-_xuSQ-ilk.jar
    Sep 27, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-6eSyyB5mkN2Ll_mEGTL_OK7nxS28_T8CpcYyyDar1QI.jar
    Sep 27, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-GF8owdSpfh6qPcKjdfvk_VqLj5BlZxBHObLsyeUaSLA.jar
    Sep 27, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 194 files cached, 26 files newly uploaded in 0 seconds
    Sep 27, 2020 6:45:22 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 27, 2020 6:45:22 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 27, 2020 6:45:22 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 27, 2020 6:45:22 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 27, 2020 6:45:22 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 27, 2020 6:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95916 bytes, hash 7048a731173408deb4551df98b77bdc1c8fb7baffbb3cf49aff76dc841171a8e> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-cEinMRc0CN60VR35i3e9wcj7e6_7s89Jr_dtyEEXGo4.pb
    Sep 27, 2020 6:45:22 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Sep 27, 2020 6:45:23 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-26_23_45_22-8388666750052355909?project=apache-beam-testing
    Sep 27, 2020 6:45:23 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-26_23_45_22-8388666750052355909
    Sep 27, 2020 6:45:23 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-26_23_45_22-8388666750052355909
    Sep 27, 2020 6:45:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-27T06:45:22.584Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 27, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T06:45:30.093Z: Worker configuration: n1-standard-1 in us-central1-f.
    Sep 27, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T06:45:30.980Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 27, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T06:45:31.019Z: Expanding GroupByKey operations into optimizable parts.
    Sep 27, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T06:45:31.043Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 27, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T06:45:31.121Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 27, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T06:45:31.157Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 27, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T06:45:31.205Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 27, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T06:45:31.241Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 27, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T06:45:31.981Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 27, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T06:45:32.064Z: Starting 5 workers in us-central1-f...
    Sep 27, 2020 6:45:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T06:45:55.770Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 27, 2020 6:46:07 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T06:46:05.675Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 27, 2020 6:46:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T06:46:16.641Z: Workers have started successfully.
    Sep 27, 2020 6:46:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T06:46:16.672Z: Workers have started successfully.
    Sep 27, 2020 6:46:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T06:46:49.137Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 27, 2020 6:46:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T06:46:49.271Z: Cleaning up.
    Sep 27, 2020 6:46:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T06:46:49.403Z: Stopping worker pool...
    Sep 27, 2020 6:47:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T06:47:39.673Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 27, 2020 6:47:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T06:47:39.713Z: Worker pool stopped.
    Sep 27, 2020 6:47:45 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-26_23_45_22-8388666750052355909 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 98eef766-b6ec-4d3b-8b96-c43eb0d999f0 and timestamp: 2020-09-27T06:47:45.268000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    13.385

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 27, 2020 6:47:45 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.019 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 2 mins 37.909 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 28s
107 actionable tasks: 62 executed, 45 from cache

Publishing build scan...
https://gradle.com/s/2amlcia2q2enw

Build cache (/home/jenkins/.gradle/caches/build-cache-1) removing files not accessed on or after Sun Sep 20 06:44:21 UTC 2020.
Build cache (/home/jenkins/.gradle/caches/build-cache-1) cleaned up in 0.199 secs.
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1044

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1044/display/redirect>

Changes:


------------------------------------------
[...truncated 274.08 KB...]
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 27, 2020 12:45:55 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 27, 2020 12:45:55 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 27, 2020 12:45:56 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 27, 2020 12:45:56 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 27, 2020 12:45:56 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 27, 2020 12:45:56 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 27, 2020 12:45:56 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
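
The failures above are the coder-inference problem the exception spells out: the RowMonitor ParDo emits Beam Row elements, and a PCollection<Row> gets no coder from the CoderRegistry unless a schema is attached. A minimal sketch of the suggested PCollection.setRowSchema fix follows; the class name, schema fields, and sample values are illustrative, not the test's actual code.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        // Illustrative schema matching the four projected columns in the query above.
        Schema schema = Schema.builder()
            .addStringField("author")
            .addStringField("type")
            .addStringField("title")
            .addInt64Field("score")
            .build();

        PCollection<Row> rows = p.apply(Create.of(
                Row.withSchema(schema).addValues("someone", "story", "a title", 5L).build())
            .withCoder(RowCoder.of(schema)));

        // A pass-through ParDo like the test's RowMonitor: its output is again a
        // PCollection<Row>, so a coder must be supplied, e.g. by re-attaching the schema.
        rows.apply("RowMonitor", ParDo.of(new DoFn<Row, Row>() {
              @ProcessElement
              public void processElement(@Element Row row, OutputReceiver<Row> out) {
                out.output(row);
              }
            }))
            .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }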

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 27, 2020 12:45:56 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 27, 2020 12:45:56 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 27, 2020 12:45:56 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 27, 2020 12:45:56 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 27, 2020 12:45:56 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 27, 2020 12:45:56 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 27, 2020 12:45:57 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 27, 2020 12:45:57 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
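
The plan and filter above show what the DIRECT_READ path hands to the BigQuery Storage API: only the four used fields are requested, and the supported predicate is evaluated server-side instead of inside BeamCalcRel. Outside Beam SQL, roughly the same read can be expressed with BigQueryIO directly; a minimal sketch, with an illustrative public table standing in for the test's dataset:

    import java.util.Arrays;
    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
    import org.apache.beam.sdk.values.PCollection;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        // Request only the projected columns and push the row filter to the
        // BigQuery Storage API, mirroring usedFields and BigQueryFilter above.
        PCollection<TableRow> rows = p.apply(
            BigQueryIO.readTableRows()
                .from("bigquery-public-data:hacker_news.full")  // illustrative table
                .withMethod(Method.DIRECT_READ)
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }
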
    Sep 27, 2020 12:45:57 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 27, 2020 12:46:00 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 27, 2020 12:46:00 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-6sUvo-j2ZEuTKo7ezNDpccgn8LX5D7XkHHOU1IiR03o.jar
    Sep 27, 2020 12:46:00 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5867853826443468948.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-BicsVDn_04K9vOyjbDTVX9oRhyYKO5XMXHMv0FKk-bc.jar
    Sep 27, 2020 12:46:00 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-R4ElzhK-SzP7Ov6WKhzjeq4_WrodWJFY979ikB9_AaA.jar
    Sep 27, 2020 12:46:00 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-5pZEJ18ejIdITf52Zsj9GOwTNymJo5ia-MOQvkj4dXU.jar
    Sep 27, 2020 12:46:00 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT--b5E7SjUjz4MegFIeUMcg-lipBIva32XOWr_bOEaqsI.jar
    Sep 27, 2020 12:46:00 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests--10Eg3njy1iYNzLgTgjvz3Lq68Ba9Nc4APd1KSFjZ4Y.jar
    Sep 27, 2020 12:46:00 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-ZH5soDgmXoys1eMruh4gBSr89hWrnB6VvHEXpAgYxEI.jar
    Sep 27, 2020 12:46:00 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-jy_PANLdHZMTsdcAlcGqQwLb0czGeSgX9HcOS1M3nXI.jar
    Sep 27, 2020 12:46:00 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-9yjfxihPsKh8ACNonugS-VhPUPCYgdxE9TSOre-DGrU.jar
    Sep 27, 2020 12:46:00 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tJvF-LMtO9Wk00SWEnvCVuVbKdHI73Ietd57J2ep7hg.jar
    Sep 27, 2020 12:46:00 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-3D0MXUgZgSogB7beiNPlRLk2uKY7odPmAKjlWNlrZZ4.jar
    Sep 27, 2020 12:46:00 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-uWZUZnu1SYy2XIxDip5d0GDdMQjbPyp6DS3k9ctqZZ0.jar
    Sep 27, 2020 12:46:00 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-bMHB7APRRzZRnvvP6ugF4T_BDDWn1oxeZ2lp0UNXm7k.jar
    Sep 27, 2020 12:46:00 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-rRKAY51PkHcPgF-fezRXjhKHFwYMBULm1G3DyCJWYVI.jar
    Sep 27, 2020 12:46:00 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/6.6.1/workerMain/gradle-worker.jar to gs://temp-storage-for-perf-tests/loadtests/staging/gradle-worker-U_aL5a9pcLxwBxCREhg6dHkIraZJd5OxEJ7qw-KLY7I.jar
    Sep 27, 2020 12:46:00 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-6sb9b4hsn52VN2ku1LjeVwVoOv2HS4r2A52yXhu3mOM.jar
    Sep 27, 2020 12:46:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-2eriKnXDWmH021xzL_o3MTeo7SzF9sDVYgW9eoMLAB0.jar
    Sep 27, 2020 12:46:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-6sUvo-j2ZEuTKo7ezNDpccgn8LX5D7XkHHOU1IiR03o.jar
    Sep 27, 2020 12:46:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-8nrbO1TWH_-4PBPrRpAU7W7MJkAyKBrwj2PobzWp7rY.jar
    Sep 27, 2020 12:46:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-03fFfPG37QNmODmUK-989fcutqrqvVcpHeeWIN_Bbgk.jar
    Sep 27, 2020 12:46:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-NRTiM-AQicTX1GFzjw2AZnUPotGPdXISdsj-2zZFAIU.jar
    Sep 27, 2020 12:46:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-lYI5wZga1PbYnI5V8PKH1AP5bzHSvFc5TDU4OBK5G5E.jar
    Sep 27, 2020 12:46:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-xF8sgxNG7vx8yDOQGibI9l1Uepoau0ZFJz04uBQ4OpI.jar
    Sep 27, 2020 12:46:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-YVQr98Sp419mwkaHB966DOF8C4SHWD519_GEuH76Gc0.jar
    Sep 27, 2020 12:46:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-RTYTJREgxRDCOZDLMpe2X4diQJlfHyuySrb4Oi80P8U.jar
    Sep 27, 2020 12:46:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-oB0ypzqttOtavk_Y2zFqDzxiUuebtIko6AUxrbWYZow.jar
    Sep 27, 2020 12:46:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-SLGixNvFZ5_6qogVwLmt9JuyI_f2SePGwVUfzguJkwI.jar
    Sep 27, 2020 12:46:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-u1aY9Q8GnCu6UiVDe_SsKlnSZ5__GeGhaEZP0RU-i0A.jar
    Sep 27, 2020 12:46:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-ndWIL-tv0PGXGQ4nRkLnmMFKjIV0BsV2ABdlOWhq5oU.jar
    Sep 27, 2020 12:46:01 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 192 files cached, 28 files newly uploaded in 0 seconds
    Sep 27, 2020 12:46:01 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 27, 2020 12:46:02 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 27, 2020 12:46:02 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 27, 2020 12:46:02 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 27, 2020 12:46:02 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 27, 2020 12:46:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95916 bytes, hash 30b20c6f02991b5fb08178174be1fefa833d547df0934ee6e2fc25471e52a4eb> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-MLIMbwKZG1-wgXgXS-H--oM9VH3wk07m4vwlRx5SpOs.pb
    Sep 27, 2020 12:46:02 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Sep 27, 2020 12:46:03 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-26_17_46_02-554644357181342224?project=apache-beam-testing
    Sep 27, 2020 12:46:03 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-26_17_46_02-554644357181342224
    Sep 27, 2020 12:46:03 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-26_17_46_02-554644357181342224
    Sep 27, 2020 12:46:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-27T00:46:02.481Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 27, 2020 12:46:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T00:46:10.652Z: Worker configuration: n1-standard-1 in us-central1-f.
    Sep 27, 2020 12:46:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T00:46:12.210Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 27, 2020 12:46:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T00:46:12.247Z: Expanding GroupByKey operations into optimizable parts.
    Sep 27, 2020 12:46:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T00:46:12.269Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 27, 2020 12:46:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T00:46:12.327Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 27, 2020 12:46:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T00:46:12.364Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 27, 2020 12:46:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T00:46:12.401Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 27, 2020 12:46:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T00:46:12.428Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 27, 2020 12:46:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T00:46:12.762Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 27, 2020 12:46:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T00:46:12.847Z: Starting 5 workers in us-central1-f...
    Sep 27, 2020 12:46:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T00:46:35.038Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 27, 2020 12:46:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T00:46:35.613Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 27, 2020 12:46:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T00:46:56.155Z: Workers have started successfully.
    Sep 27, 2020 12:46:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T00:46:56.179Z: Workers have started successfully.
    Sep 27, 2020 12:47:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T00:47:32.185Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 27, 2020 12:47:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T00:47:32.312Z: Cleaning up.
    Sep 27, 2020 12:47:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T00:47:32.408Z: Stopping worker pool...
    Sep 27, 2020 12:48:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T00:48:21.393Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 27, 2020 12:48:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T00:48:21.434Z: Worker pool stopped.
    Sep 27, 2020 12:48:27 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-26_17_46_02-554644357181342224 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 5edf6410-6ef9-4340-b0d6-285ee1d0a118 and timestamp: 2020-09-27T00:48:27.717000000Z:
                     Metric:                    Value:
                   read_time                    16.875
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 27, 2020 12:48:28 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 2 mins 40.607 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 10s
107 actionable tasks: 66 executed, 41 from cache

Publishing build scan...
https://gradle.com/s/jbcnkenilccne

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1043

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1043/display/redirect>

Changes:


------------------------------------------
[...truncated 271.37 KB...]
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_3/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 26, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 26, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 26, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 26, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 26, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 26, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 26, 2020 6:45:21 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 26, 2020 6:45:21 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 26, 2020 6:45:21 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 26, 2020 6:45:21 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 26, 2020 6:45:21 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 26, 2020 6:45:21 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 26, 2020 6:45:21 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 26, 2020 6:45:21 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 26, 2020 6:45:21 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 26, 2020 6:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 26, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 26, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-gEMvbcJ0QFPHhI3nM31ZexS5ZXPlAsVGMiWAasdl5XA.jar
    Sep 26, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-0QQYYdSQ4Muj0utkBGIhfBraRWXN1KZko_Ja4O-R-_s.jar
    Sep 26, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-__TY8YKKu-dcZQZQJPmUJ_hOisz9uqlcHGYEMk0_S2A.jar
    Sep 26, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-MuIDUK5E4fkigHSRSOfdnSn7dw6VcNfB1fItsoaU_VI.jar
    Sep 26, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-jfDAqb_UUo-hbJ6XcYVU3Wgi0wFxC5akO_KRU0XhICY.jar
    Sep 26, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-7om7bRBojdJfbwtp9Wcdy_olkVjxBEp7BnbYVbaUHgw.jar
    Sep 26, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-BV9zLqyQr2QyQwM5MlsqZ54XU8ZAJkBUVDRozD6tt84.jar
    Sep 26, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-Tae6FerTreP05Bl18T8fqeJWZfnoi513UpO0Kug8gt8.jar
    Sep 26, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-YRRFd9VokoiwHqqqDwLHDHqVZBRRC7nWy6XsSjppAsM.jar
    Sep 26, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-uvH8mM2QcM7-LgU2Tq5pvdgX3qG9fEJTXgTfDIS8YDA.jar
    Sep 26, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT--1-d3Hdw9hafLOYK6B9sFtbqPyRS17PegCB-v8upNrI.jar
    Sep 26, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-x0Qp1MUOT8DjHjZiWOdTJ4CP0JEa-GvbskuSx2EBekg.jar
    Sep 26, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-69Z30Qj8UhluJJsGQdjm0v5ok8bFZbMqoEAcwFNg_pA.jar
    Sep 26, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-M0xQvH47rmQepqEDNpiraShymVetldqmaEyrXgM08dg.jar
    Sep 26, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-Zl2rejjC4Cp-zgOsebq2J7ZT8F70sZGHgCe5xYuluh8.jar
    Sep 26, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-blRMSQ37fItVK5TtivnzdXXXcP1s5n-0Tsd4cU8Sa-Y.jar
    Sep 26, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-OIS-asTowK8MGA1puoHEsjK6anS9g_yhQxwvLS0bYJo.jar
    Sep 26, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-UJYMj-dy7rZY9ISNqHiCQqh8hAFzHypgH2Uqub9g6iM.jar
    Sep 26, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-1YTSaGg2lWpQ8xjOQdgXOx8IVOV2qSwheZdweeeT64Y.jar
    Sep 26, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-05AVyeprOxTQxodQxN3zf_LkCgPfnOu3Goel6MQk_X8.jar
    Sep 26, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-4ULsgmt-sM9XfrFrq_65PR3rUBp9fUT98gHdHbDG_rg.jar
    Sep 26, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test977875011780437055.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-tz48bLChALScWT8OI2tP528332m2QcIKPrvKOrPVop4.jar
    Sep 26, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-gEMvbcJ0QFPHhI3nM31ZexS5ZXPlAsVGMiWAasdl5XA.jar
    Sep 26, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-dEb_Ulzke4-knAJ2sPOPINWT7mhBy_OEkZ2mYLiYD7k.jar
    Sep 26, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-LPLPfB7PM__4QKQsyEIsRIH-kCG3k5GcKq2PV7RwpFE.jar
    Sep 26, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-fZfGL7Hpr551Ux7Srp-eg0rHjrNJS7K-BRTe73S_VRA.jar
    Sep 26, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-4hSoktn6-SR5FjwzikrMEkupHLAY0vqdvaYVnzRKRtg.jar
    Sep 26, 2020 6:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 194 files cached, 26 files newly uploaded in 1 seconds
    Sep 26, 2020 6:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 26, 2020 6:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 26, 2020 6:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 26, 2020 6:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 26, 2020 6:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 26, 2020 6:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95915 bytes, hash 40edb86630ac4047c0bfca851713dde1521441aeae01a8a17073b77321ed7f92> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-QO24ZjCsQEfAv8qFFxPd4VIUQa6uAaihcHO3cyHtf5I.pb
    Sep 26, 2020 6:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Sep 26, 2020 6:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-26_11_45_27-1515094053020343100?project=apache-beam-testing
    Sep 26, 2020 6:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-26_11_45_27-1515094053020343100
    Sep 26, 2020 6:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-26_11_45_27-1515094053020343100
    Sep 26, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-26T18:45:27.043Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 26, 2020 6:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T18:45:35.320Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 26, 2020 6:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T18:45:35.867Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 26, 2020 6:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T18:45:35.907Z: Expanding GroupByKey operations into optimizable parts.
    Sep 26, 2020 6:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T18:45:35.995Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 26, 2020 6:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T18:45:36.069Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 26, 2020 6:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T18:45:36.100Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 26, 2020 6:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T18:45:36.127Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 26, 2020 6:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T18:45:36.155Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 26, 2020 6:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T18:45:36.483Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 26, 2020 6:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T18:45:36.559Z: Starting 5 workers in us-central1-b...
    Sep 26, 2020 6:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T18:45:46.132Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 26, 2020 6:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T18:46:04.940Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 26, 2020 6:46:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T18:46:22.462Z: Workers have started successfully.
    Sep 26, 2020 6:46:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T18:46:22.541Z: Workers have started successfully.
    Sep 26, 2020 6:46:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T18:46:51.416Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 26, 2020 6:46:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T18:46:51.571Z: Cleaning up.
    Sep 26, 2020 6:46:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T18:46:51.644Z: Stopping worker pool...
    Sep 26, 2020 6:47:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T18:47:43.311Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 26, 2020 6:47:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T18:47:43.350Z: Worker pool stopped.
    Sep 26, 2020 6:47:48 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-26_11_45_27-1515094053020343100 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 177a3dc8-496a-481e-9236-9cac779c420d and timestamp: 2020-09-26T18:47:48.924000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    12.161

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 26, 2020 6:47:49 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.024 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 2 mins 37.853 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 31s
107 actionable tasks: 62 executed, 45 from cache

Publishing build scan...
https://gradle.com/s/pcswymruuv6mu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1042

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1042/display/redirect>

Changes:


------------------------------------------
[...truncated 270.44 KB...]
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_3/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)
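
The IllegalStateException above is the coder-inference failure that the error text itself explains: a PCollection of Beam Rows needs a schema attached before the pipeline graph is finalized. A minimal sketch of the fix the message suggests, assuming a hand-built schema and an in-memory source (the field names mirror the projected HACKER_NEWS columns but are otherwise illustrative, not taken from the failing test):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaExample {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Illustrative schema for the projected columns.
        Schema schema =
            Schema.builder()
                .addNullableField("author", Schema.FieldType.STRING)
                .addNullableField("type", Schema.FieldType.STRING)
                .addNullableField("title", Schema.FieldType.STRING)
                .addNullableField("score", Schema.FieldType.INT64)
                .build();

        PCollection<Row> rows =
            p.apply(Create.of(
                    Row.withSchema(schema).addValues("someone", "story", "A title", 3L).build()))
                // Without setRowSchema (or an explicit setCoder), coder inference
                // fails with exactly the IllegalStateException shown above.
                .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }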

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 26, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 26, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 26, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 26, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 26, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 26, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 26, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])
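
For reference, the query and plan logged above can also be expressed through the public SqlTransform API rather than the internal BeamSqlRelUtils path the IT exercises. A sketch under the assumption that the input is a schema-aware PCollection<Row> (PCOLLECTION is Beam SQL's default table name for a single input collection; 'stories' is a placeholder, not part of the test):

    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class HackerNewsFilter {
      // 'stories' is assumed to be a PCollection<Row> whose schema contains
      // by, type, title and score fields and which already has its row schema set.
      static PCollection<Row> apply(PCollection<Row> stories) {
        return stories.apply(
            SqlTransform.query(
                "SELECT `by` AS author, type, title, score "
                    + "FROM PCOLLECTION "
                    + "WHERE (type = 'story' OR type = 'job') AND score > 2"));
      }
    }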


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 26, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 26, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 26, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 26, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 26, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 26, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 26, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 26, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
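
The push-down summarized above (usedFields plus the supported filter) is the kind of projection and filtering the BigQuery Storage Read API can apply server-side. Outside of Beam SQL, roughly the same read could be written directly against BigQueryIO; a sketch with a placeholder table reference and a Pipeline argument standing in for the test pipeline:

    import com.google.api.services.bigquery.model.TableRow;
    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
    import org.apache.beam.sdk.values.PCollection;

    public class DirectReadSketch {
      static PCollection<TableRow> read(Pipeline pipeline) {
        return pipeline.apply(
            BigQueryIO.readTableRows()
                .from("some-project:some_dataset.hacker_news") // placeholder table
                .withMethod(Method.DIRECT_READ)
                // Corresponds to usedFields=[by, type, title, score] above.
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                // Corresponds to the pushed-down filter logged above.
                .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));
      }
    }
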
    Sep 26, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 26, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 26, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-cu2eBEyDYWM7FmtP780480JQEAUsLOYohS6SIYGdWok.jar
    Sep 26, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-7p_dd5XQi3MMfi1wBkLRhBMXiyS9PIi5QRznE_m6cGQ.jar
    Sep 26, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-ippPn-cs36hH84gDTT14SHILx5ATvPjH-i2CVCppWiE.jar
    Sep 26, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-CHAJ2FN4eV9VNh25T1JfiaAz92DP5Z5bweNGzaW-Z_c.jar
    Sep 26, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-cWjs6trXdh6roW2ihciNraDke1QSa9nr5PijqHopFKc.jar
    Sep 26, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-CjM-vmYL7OoBCnCbho9deRYuQD22iHzXb2PYehNdM-g.jar
    Sep 26, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-SJHQB-G9pS0RjkefpQk3EYofi0urYtPZJ305R_ilDHw.jar
    Sep 26, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-YgiWAJEIDl0XlPzFoNsLrPkIHfylcUP7nVZ0PMpAAIM.jar
    Sep 26, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-qxaGzfM9_FHH_I0YJOexKmvUN2aR8HBYBOhCt2Yapfg.jar
    Sep 26, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-612QYmYaQbayH9cJkjSsURyFf4cuLcI90TNqiR9Hl8c.jar
    Sep 26, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-LaUHsu-7QtJ629JZAgYodA5iFFtlOZBOVzU8K_Xd6TQ.jar
    Sep 26, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-vDPCH7wBzYpvZhPAlQhRFiMIh57YzXnPk4Yq7FiF2Cw.jar
    Sep 26, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-3GJ5-I3TucpdoQyfGzTyCgxM08Hzcoqnsqwx42Q6psg.jar
    Sep 26, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-ngeq9is9kRcr8fHZrj9_FryOHdYwYcgtqkPYSX1GiO4.jar
    Sep 26, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-ysWRD53WfHzVtMgtbF7aaa92FM7Y3ZVdEYtfDQf2KDk.jar
    Sep 26, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-_ob41VWHifZYkCYZ4kQkZxM1e-jLbbuj9JbVLwXnaxg.jar
    Sep 26, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-cu2eBEyDYWM7FmtP780480JQEAUsLOYohS6SIYGdWok.jar
    Sep 26, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7564404547239429244.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-aImjjSo_ZIzkO9FjUV1mhJAbUOy7eTwHT8jYwbT6qs0.jar
    Sep 26, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-SrW4AIZ8_BVKaqqtvm8vBC57KRb_pAQCmuVWaOhYq_c.jar
    Sep 26, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-lz4mwZW93F_5rqB9i2Ntt0bvH7YYfbh2jyXMdL4TEH8.jar
    Sep 26, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-I26BrKYwv_4JfM9hXX-JrM4pQXGtGB9WPBhehSvhCmE.jar
    Sep 26, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-ojEchKmeJFU0_v2amaX6O5IwyzGiq353rR3Sc4fp7mo.jar
    Sep 26, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-VXKY62yGxvurmaAJ42a9Aavlu8VfKuwfLkhuW1056EY.jar
    Sep 26, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-PgmNDa2UmDk7miACExvhL9RLH_2rs1332v6fJ3QFwTc.jar
    Sep 26, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-VtyD1Cy1r5hV4_VdYKEPoByf82SZhcmt1o3QdkfHnw0.jar
    Sep 26, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-EnrXuhTo_ly_LVGodA9xV-UTrWicyazKzHYWv2BAE5o.jar
    Sep 26, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-up7ZYbanGIQtooxEr6TS-B8omfo3kljF1sOQpVu0Emk.jar
    Sep 26, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 194 files cached, 26 files newly uploaded in 0 seconds
    Sep 26, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 26, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 26, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 26, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 26, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 26, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95914 bytes, hash 7be597d9a0ccebb3e283be269d90b3d986027af5e0c0fd5bd8e4d3b3833839fd> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-e-WX2aDM67Pig74mnZCz2YYCevXgwP1b2OTTs4M4Of0.pb
    Sep 26, 2020 12:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Sep 26, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-26_05_45_23-17277875205666871586?project=apache-beam-testing
    Sep 26, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-26_05_45_23-17277875205666871586
    Sep 26, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-26_05_45_23-17277875205666871586
    Sep 26, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-26T12:45:23.147Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 26, 2020 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T12:45:33.799Z: Worker configuration: n1-standard-1 in us-central1-f.
    Sep 26, 2020 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T12:45:34.442Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 26, 2020 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T12:45:34.474Z: Expanding GroupByKey operations into optimizable parts.
    Sep 26, 2020 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T12:45:34.502Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 26, 2020 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T12:45:34.580Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 26, 2020 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T12:45:34.607Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 26, 2020 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T12:45:34.631Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 26, 2020 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T12:45:34.663Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 26, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T12:45:35.010Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 26, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T12:45:35.082Z: Starting 5 workers in us-central1-f...
    Sep 26, 2020 12:46:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T12:46:00.881Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 26, 2020 12:46:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T12:46:08.841Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 26, 2020 12:46:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T12:46:20.173Z: Workers have started successfully.
    Sep 26, 2020 12:46:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T12:46:20.205Z: Workers have started successfully.
    Sep 26, 2020 12:46:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T12:46:52.961Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 26, 2020 12:46:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T12:46:53.122Z: Cleaning up.
    Sep 26, 2020 12:46:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T12:46:53.195Z: Stopping worker pool...
    Sep 26, 2020 12:47:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T12:47:40.371Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 26, 2020 12:47:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T12:47:40.418Z: Worker pool stopped.
    Sep 26, 2020 12:47:46 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-26_05_45_23-17277875205666871586 finished with status DONE.
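
The submission, autoscaling and teardown messages above all come from the DataflowRunner. For orientation, a minimal sketch of how such a run is typically configured programmatically; project, region and temp location here are placeholders echoing values in the log, not a prescription:

    import org.apache.beam.runners.dataflow.DataflowRunner;
    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class DataflowRunSketch {
      public static void main(String[] args) {
        DataflowPipelineOptions options =
            PipelineOptionsFactory.fromArgs(args).as(DataflowPipelineOptions.class);
        options.setRunner(DataflowRunner.class);
        options.setProject("apache-beam-testing");   // placeholder project id
        options.setRegion("us-central1");            // region seen in the log
        options.setTempLocation("gs://some-temp-bucket/tmp"); // placeholder bucket

        Pipeline pipeline = Pipeline.create(options);
        // ... apply the transforms under test ...
        // run() submits the job and logs a monitoring-console URL and job id,
        // as in the output above; waitUntilFinish() blocks until the job is DONE.
        pipeline.run().waitUntilFinish();
      }
    }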

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): ad6b38e9-196f-49b4-81b8-a21ab002f810 and timestamp: 2020-09-26T12:47:46.290000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    15.239

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 26, 2020 12:47:46 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 2 mins 37.282 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 30s
107 actionable tasks: 62 executed, 45 from cache

Publishing build scan...
https://gradle.com/s/gtmuwe2pz4o2o

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1041

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1041/display/redirect?page=changes>

Changes:

[noreply] [BEAM-10959] Go SDK: Store a fixed amount of known process bundle


------------------------------------------
[...truncated 273.48 KB...]
    Sep 26, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 26, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 26, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 26, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 26, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 26, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 26, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 26, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 26, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 26, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 26, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 26, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 26, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 26, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 26, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-zQlyVgiPS315qD7r7PXUvLxFIqUHgN5lmDufmsJNTwc.jar
    Sep 26, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-mnAPMOzD_Z-jrPJEoeERgXL-Q0v2yd3MNLyCXODcAIk.jar
    Sep 26, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8785051967837179816.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-KRnvjdIkmFeXXm7i21CYxd6buXK7YZ0Mr8Fq0TNkBB0.jar
    Sep 26, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-OaclGtPsdzjswp2y8uw8FRy9TEPzQ_y_nwUzEhVqbD4.jar
    Sep 26, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-g6Qg9nFf8PPjpdAMTQcfufl5fz82yVeAOVaYyMbfOv4.jar
    Sep 26, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-kzeSJgYDVN8rDbgNcDLkVXCbLmEKUPcX6JchOU4DVmc.jar
    Sep 26, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-CSj21lD_PnxLMh0BoyLruEamq81h7iA0MHeKB2vDcNM.jar
    Sep 26, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-0-sPSHP0c6qXx5dkRUQ1T-06raTMvVJSy-HOUGEEMFg.jar
    Sep 26, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-pxuBY66bOB5_pKRYFqL22cY8qlldZAgSbe_EirYxys4.jar
    Sep 26, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tXKhU52N2QRX2Pb8Zx5hf7_qAbpYb5FK0CBwm-dg__M.jar
    Sep 26, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-zQlyVgiPS315qD7r7PXUvLxFIqUHgN5lmDufmsJNTwc.jar
    Sep 26, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-2ddHc19DIgUlb-wubMMeD1J-GxGHSUMicCLHUBkKSpo.jar
    Sep 26, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-niysu5pyZ95eP-9jao6MARxEjem5OiU91ZYD5zmOjrU.jar
    Sep 26, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-FM6w6qdONj-rkTfXvS8u2VTipPLUad44avOljjhET4s.jar
    Sep 26, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-dfM3nNIJegL_1qrZXDc16qqmie6uugewdPrd0Bcbtqs.jar
    Sep 26, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests--xeFaOZ-2sbD4k8qwD2yVtMQOzjGmysMhyrDqY3g7q4.jar
    Sep 26, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-bErCNTaZuzTlFKFQleeerFe_Kz48FArmyAIIXn9KvKw.jar
    Sep 26, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-JfvjoS24S8PCLdO-89vRVsmh71dGP8R58HIeTYwULHk.jar
    Sep 26, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-7UzWALl7niGEltxp0CpjroVyNLMv9WkW8N8DjjgSdeQ.jar
    Sep 26, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-2QeoLal9f2G_i06WcsPTJEmjMDYgNDQUjXzJTbI3NSs.jar
    Sep 26, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-Y2CC-ERBAHyDQoRum1gnInxjdCu6piy1n9hhMxvOi4I.jar
    Sep 26, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-ofM1hZ-Ag2KHrWopeZTkT7-Ii7FCkhnvkCDq0UxyXAk.jar
    Sep 26, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-x3P8zsArvgJuL5qtN5J9E7870MLjoTpSUxz0OkgXZu8.jar
    Sep 26, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-VpHL6Fcs-noLeET4JrkbJMx60fpJ8YqwV0WIEb8eVA4.jar
    Sep 26, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-tvGmWVwnJ1URMkR2duwgXB4fxVtLZbLOmlPGE0xUQtc.jar
    Sep 26, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-6-jj58To92qnIyVJ7KNRs5bKlIlxdIxsudAMJVfU2kw.jar
    Sep 26, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-5_J2Ul-7w7vsMJE8fGFFyhO_yRyT63f8n1z-KEcf_uQ.jar
    Sep 26, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.testcontainers/kafka/1.14.3/b90885e30e86eb454e7b0e8e580cf59616e9de39/kafka-1.14.3.jar to gs://temp-storage-for-perf-tests/loadtests/staging/kafka-1.14.3-ITKoa6D-lUtLem0itUO_R5B5PqtFpvvOjJf9bTByVUk.jar
    Sep 26, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.testcontainers/testcontainers/1.14.3/71fc82ba663f469447a19434e7db90f3a872753/testcontainers-1.14.3.jar to gs://temp-storage-for-perf-tests/loadtests/staging/testcontainers-1.14.3-pumfUOVGxIRJ3HrNRtmRWTZcp5hY59_klYGQF6d6puQ.jar
    Sep 26, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.rnorth/tcp-unix-socket-proxy/1.0.2/cf53989130986c60113032e25185f4496ffbc186/tcp-unix-socket-proxy-1.0.2.jar to gs://temp-storage-for-perf-tests/loadtests/staging/tcp-unix-socket-proxy-1.0.2-KCMDGSy00Z-cM3qmB0jDR5sQSJV17Q8Xofxkoxij7SM.jar
    Sep 26, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.rnorth.duct-tape/duct-tape/1.0.8/92edc22a9ab2f3e17c9bf700aaee377d50e8b530/duct-tape-1.0.8.jar to gs://temp-storage-for-perf-tests/loadtests/staging/duct-tape-1.0.8-Mc7xLd7JedH4bXz3CMQaF9pSPQXGhf1mQunQsq3bckA.jar
    Sep 26, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.rnorth.visible-assertions/visible-assertions/2.1.2/20d31a578030ec8e941888537267d3123c2ad1c1/visible-assertions-2.1.2.jar to gs://temp-storage-for-perf-tests/loadtests/staging/visible-assertions-2.1.2-RQSulosjfNzcto_1sHqmOr5JkvkHp3w9YSCqm5BBQBw.jar
    Sep 26, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/com.kohlschutter.junixsocket/junixsocket-common/2.0.4/b4d1870bf903412533e0b79c6fcd402defcfc05b/junixsocket-common-2.0.4.jar to gs://temp-storage-for-perf-tests/loadtests/staging/junixsocket-common-2.0.4-r8N2Fez3-t_3TSmvtEP-T2M9OWZG2J2CXoIkoneDn2A.jar
    Sep 26, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.jetbrains/annotations/19.0.0/efbff6752f67a7c9de3e4251c086a88e23591dfd/annotations-19.0.0.jar to gs://temp-storage-for-perf-tests/loadtests/staging/annotations-19.0.0-Ev8B7q8MCcamjy7AJLO_n6TK1uaLdLlov2LH91kEcDI.jar
    Sep 26, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/com.kohlschutter.junixsocket/junixsocket-native-common/2.0.4/726bd66a934dea39c817382986496fa4eda96411/junixsocket-native-common-2.0.4.jar to gs://temp-storage-for-perf-tests/loadtests/staging/junixsocket-native-common-2.0.4-92O4XsFT2VMJB0dOfyBspSsocDfXBLrO3aON1cTQ9gw.jar
    Sep 26, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.scijava/native-lib-loader/2.0.2/1451fa03954c5e31a358b411147de472b4dab92c/native-lib-loader-2.0.2.jar to gs://temp-storage-for-perf-tests/loadtests/staging/native-lib-loader-2.0.2-5WfHHp8_9T94vVj9a6bUcc4x4SY_XofR4fzF0-2h4kg.jar
    Sep 26, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/net.java.dev.jna/jna/5.5.0/e0845217c4907822403912ad6828d8e0b256208/jna-5.5.0.jar to gs://temp-storage-for-perf-tests/loadtests/staging/jna-5.5.0-swj66_5O1AnehBDgpjLRZLISawNfbqz_lo05CMr7TZ4.jar
    Sep 26, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/net.java.dev.jna/jna-platform/5.5.0/af38e7c4d0fc73c23ecd785443705bfdee5b90bf/jna-platform-5.5.0.jar to gs://temp-storage-for-perf-tests/loadtests/staging/jna-platform-5.5.0-JNgWIfgqwp_N2adBFgMfWQeiNDFY5hb0Vzu_okNK4NU.jar
    Sep 26, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 183 files cached, 37 files newly uploaded in 1 seconds
    Sep 26, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 26, 2020 6:45:21 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 26, 2020 6:45:21 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 26, 2020 6:45:21 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 26, 2020 6:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 26, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95916 bytes, hash 871a62452919f18072d69f7132e2dd9510374869de9a39a2a6b41cb7213349be> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-hxpiRSkZ8YBy1p9xMuLdlRA3SGnemjmiprQctyEzSb4.pb
    Sep 26, 2020 6:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Sep 26, 2020 6:45:22 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-25_23_45_21-778638351766346219?project=apache-beam-testing
    Sep 26, 2020 6:45:22 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-25_23_45_21-778638351766346219
    Sep 26, 2020 6:45:22 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-25_23_45_21-778638351766346219
    Sep 26, 2020 6:45:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-26T06:45:21.387Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 26, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T06:45:29.450Z: Worker configuration: n1-standard-1 in us-central1-f.
    Sep 26, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T06:45:30.626Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 26, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T06:45:30.659Z: Expanding GroupByKey operations into optimizable parts.
    Sep 26, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T06:45:30.691Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 26, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T06:45:30.768Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 26, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T06:45:30.797Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 26, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T06:45:30.829Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 26, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T06:45:30.852Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 26, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T06:45:31.373Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 26, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T06:45:31.451Z: Starting 5 workers in us-central1-f...
    Sep 26, 2020 6:45:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T06:45:53.495Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 26, 2020 6:45:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T06:45:54.771Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 26, 2020 6:46:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T06:46:13.978Z: Workers have started successfully.
    Sep 26, 2020 6:46:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T06:46:14.015Z: Workers have started successfully.
    Sep 26, 2020 6:46:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T06:46:45.447Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 26, 2020 6:46:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T06:46:45.614Z: Cleaning up.
    Sep 26, 2020 6:46:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T06:46:45.722Z: Stopping worker pool...
    Sep 26, 2020 6:47:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T06:47:26.924Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 26, 2020 6:47:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T06:47:26.969Z: Worker pool stopped.
    Sep 26, 2020 6:47:32 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-25_23_45_21-778638351766346219 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): c6628144-2ddb-4f69-935e-3b9ce8f5f86f and timestamp: 2020-09-26T06:47:32.808000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    13.502

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 26, 2020 6:47:33 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.025 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.031 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 5,5,main]) completed. Took 2 mins 25.887 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 16s
107 actionable tasks: 62 executed, 45 from cache

Publishing build scan...
https://gradle.com/s/qvrbwhpbonohg

Build cache (/home/jenkins/.gradle/caches/build-cache-1) removing files not accessed on or after Sat Sep 19 06:44:21 UTC 2020.
Build cache (/home/jenkins/.gradle/caches/build-cache-1) cleaned up in 0.704 secs.
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1040

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1040/display/redirect?page=changes>

Changes:

[noreply] spannerio.py doc typo fix

[noreply] [BEAM-10975] Remove capture_output argument in sdk_container_builder

[noreply] [BEAM-10959] Store a fixed amount of known process bundle instructions

[noreply] [BEAM-10977] Disable codecov annotations in GH


------------------------------------------
[...truncated 270.14 KB...]
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_3/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

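The IllegalStateException above is the pipeline-construction failure behind both readUsingDirectReadMethod and readUsingDefaultMethod: the ParDo(RowMonitor) step emits a PCollection<Row>, and Beam cannot infer a coder for Row unless a schema is attached. The error message itself names the two remedies (setCoder or setRowSchema). Below is a minimal, self-contained sketch of the setRowSchema fix; it is not the IT's actual code, and the schema and values are hypothetical stand-ins for the four projected columns:

    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        // Hypothetical schema standing in for the projected columns (author, type, title, score).
        Schema schema = Schema.builder()
            .addNullableField("author", Schema.FieldType.STRING)
            .addNullableField("type", Schema.FieldType.STRING)
            .addNullableField("title", Schema.FieldType.STRING)
            .addNullableField("score", Schema.FieldType.INT64)
            .build();

        Pipeline p = Pipeline.create();

        PCollection<Row> rows = p.apply(
            Create.of(Arrays.asList(
                    Row.withSchema(schema).addValues("alice", "story", "hello", 3L).build()))
                .withRowSchema(schema));

        rows
            .apply("RowMonitor", ParDo.of(new DoFn<Row, Row>() {
              @ProcessElement
              public void process(@Element Row row, OutputReceiver<Row> out) {
                out.output(row);
              }
            }))
            // The ParDo output is a new PCollection<Row>; without this call (or an explicit
            // setCoder(RowCoder.of(schema))) pipeline construction fails with the
            // IllegalStateException shown in the trace above.
            .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }
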
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 26, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 26, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 26, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 26, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 26, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 26, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 26, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 26, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 26, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 26, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 26, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 26, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 26, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 26, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 26, 2020 12:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 26, 2020 12:45:14 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 26, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 26, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-gD0cssr_Nre30SwYn3vE7C66yFDX-nbOpX8G6Rr1o4I.jar
    Sep 26, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8795106637567226117.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-182YcuIBAqsbc9XXQrgt9z8lziGP2aPWrzjQ9TGhrJo.jar
    Sep 26, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-vfb-NjDjkw9Z6EBRfY4CCnHX5ESFNpPy8bhH2RkZZOE.jar
    Sep 26, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-vaHqsb4jBdoMouVhGkkLO8VJuyjcHsEn02Hfe07YB_Y.jar
    Sep 26, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-XzG5ZP_HT06ZU2yLejT1nSLVCWLLdEjcr7jJYcz1kO0.jar
    Sep 26, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-aoa7nlV7wQJZx6Ntr5ny6fj6W1KHkTzWmFX_BY3ExrY.jar
    Sep 26, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-RMLrVvBmihHPPQBEJN0rOxs0BwESBlBhJ95owhPbASU.jar
    Sep 26, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-JML0ThKyKX8agBUT1ujDf8Z3swYoWjs9TiShalNGoUE.jar
    Sep 26, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT--huXLTjAanmoM_YVOGQpeqQjcwvDxaHNh8c2kCj_JKU.jar
    Sep 26, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-L9gC7xj4Sj_N6Ss-PJ9dQhQYPH4SPsFwo8dC7m2p-j4.jar
    Sep 26, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-Xk0S3nbeJe3IwRVMGmEYgY7j1IghiH-svmO___BXIgU.jar
    Sep 26, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-GON0UK1rHGubzwPJH76vNjflb8_6RROzdZ-cGRC7SZ0.jar
    Sep 26, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-z3dXZQHhwk7MO2fuCwNYPdXZyUo_9-LI7fNsTrtyOpU.jar
    Sep 26, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-RGsa7WI14w_XSif2erNBOLpAKTaKWwiG3xoSr9RVstw.jar
    Sep 26, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-hUmucsJGN_dGSvAdaMjYAbAtq5NAJYNaMmBCcigyXs8.jar
    Sep 26, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-bUYNv5dg1Ys9B4j7bf-LIqPwJm-tzeBpl6i9VDapT1g.jar
    Sep 26, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT--th_JXrcpBw_2JLD1-XWmFAd7ah9QUFCv9Hw_uEIxMc.jar
    Sep 26, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-2VL-IUR3jxFRWzEdkIbvGQ7OS88c4k3htnRAUwdrZ6g.jar
    Sep 26, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-NE7aYXHgOEakpiFy7KCxoLN6WPJxL0ohfYZ7vEmu_Jk.jar
    Sep 26, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-cG_dqu0kfy3QjTSwF1zawkPdUZ8AdG68R-xoH2AJSlk.jar
    Sep 26, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-fhSZPawiM2A1UetpFnaqvgNS1UTpHP5Az2jZB3yq290.jar
    Sep 26, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-Ab3RYGmXmHnzzq46fziiHYU0F7NuADI8JfsLoGfvqKg.jar
    Sep 26, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-gD0cssr_Nre30SwYn3vE7C66yFDX-nbOpX8G6Rr1o4I.jar
    Sep 26, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-NxPkraHyJAdVcI-Y40vKnop38HbS_Ru93V-2ES1aDw0.jar
    Sep 26, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-OCmzxJ_4Y58RNJ5HoHwu4Mp2BB271fxKXeGgMUX3TvE.jar
    Sep 26, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-SoxvWIKHeeUe1wbsLzYoZ7fCVSDj5QW3r5VWlle143Q.jar
    Sep 26, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-2H-S1pf0IDD8wA5EQ8ZpoEl4IP8NadB9ZE_LAlbKMI0.jar
    Sep 26, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 194 files cached, 26 files newly uploaded in 1 seconds
    Sep 26, 2020 12:45:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 26, 2020 12:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 26, 2020 12:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 26, 2020 12:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 26, 2020 12:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 26, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95914 bytes, hash ffeae5bedc95e0a4611cabe589a0197c4e8e28768f3bbcad3ee8e632205a1cee> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-_-rlvtyV4KRhHKvliaAZfE6OKHaPO7ytPujmMiBaHO4.pb
    Sep 26, 2020 12:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Sep 26, 2020 12:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-25_17_45_18-3924713643940463119?project=apache-beam-testing
    Sep 26, 2020 12:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-25_17_45_18-3924713643940463119
    Sep 26, 2020 12:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-25_17_45_18-3924713643940463119
    Sep 26, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-26T00:45:18.419Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 26, 2020 12:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T00:45:26.079Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 26, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T00:45:26.733Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 26, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T00:45:26.777Z: Expanding GroupByKey operations into optimizable parts.
    Sep 26, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T00:45:26.818Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 26, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T00:45:26.878Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 26, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T00:45:26.918Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 26, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T00:45:26.947Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 26, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T00:45:26.995Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 26, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T00:45:27.368Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 26, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T00:45:27.460Z: Starting 5 workers in us-central1-b...
    Sep 26, 2020 12:45:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T00:45:58.794Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 26, 2020 12:45:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T00:45:58.909Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 26, 2020 12:46:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T00:46:16.306Z: Workers have started successfully.
    Sep 26, 2020 12:46:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T00:46:16.343Z: Workers have started successfully.
    Sep 26, 2020 12:46:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T00:46:54.013Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 26, 2020 12:46:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T00:46:54.215Z: Cleaning up.
    Sep 26, 2020 12:46:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T00:46:54.351Z: Stopping worker pool...
    Sep 26, 2020 12:47:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T00:47:49.554Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 26, 2020 12:47:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T00:47:49.610Z: Worker pool stopped.
    Sep 26, 2020 12:47:56 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-25_17_45_18-3924713643940463119 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 7556ddf3-b0a6-4d3e-8959-dcbd9fbab40c and timestamp: 2020-09-26T00:47:56.128000000Z:
                     Metric:                    Value:
                   read_time                    19.008
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 26, 2020 12:47:56 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

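In the readUsingDirectReadMethodPushDown run above, the DIRECT_READ table (the method is reported by the BigQueryTable constructor log lines) lets the planner replace the filtering BeamCalcRel with a BeamPushDownIOSourceRel: only the four used fields [by, type, title, score] are read and the predicate (`type` = 'story' OR `type` = 'job') AND `score` > 2 is pushed into the BigQuery read, whereas the DEFAULT-method plan keeps the filter in the pipeline. At the IO level this corresponds to the BigQuery Storage Read API's column selection and row restriction. The following is only an illustrative, standalone sketch of that capability expressed directly with BigQueryIO, not how the IT builds its pipeline; the table reference points at the public Hacker News dataset purely as an example:

    import java.util.Arrays;
    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;
    import org.apache.beam.sdk.values.PCollection;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        PCollection<TableRow> rows = p.apply(
            BigQueryIO.readTableRows()
                // Illustrative table reference, not the table the perf test reads.
                .from("bigquery-public-data:hacker_news.full")
                // DIRECT_READ uses the BigQuery Storage Read API, which is what makes
                // project and filter push-down possible in the first place.
                .withMethod(TypedRead.Method.DIRECT_READ)
                // Counterpart of usedFields=[by, type, title, score] in the BEAMPlan above.
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                // Counterpart of the filter logged by BigQueryTable.buildIOReader.
                .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }
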
Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':',5,main]) completed. Took 2 mins 51.503 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 40s
107 actionable tasks: 62 executed, 45 from cache

Publishing build scan...
https://gradle.com/s/eqlgksxmcbzde

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1039

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1039/display/redirect?page=changes>

Changes:

[Robin Qiu] Update release guide and script

[noreply] [BEAM-9616] Add RegisterDoFn (#12903)


------------------------------------------
[...truncated 269.54 KB...]
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_3/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 25, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 25, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 25, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 25, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 25, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 25, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 25, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 25, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 25, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 25, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 25, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 25, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 25, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 25, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 25, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 25, 2020 6:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test68209336544983503.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-R7pM3kHKQ4IhBo1WX8noSLgXgDwI6F6oOReZ5G2O30g.jar
    Sep 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-2G7S5KEMDYiE_UXE3XYOo_pQ8iBdPYW2gMXD0vwq028.jar
    Sep 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-VonJKzQBO6CeHcKqgFZGZLYzUewrL3MYMGLM5Phg3N4.jar
    Sep 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-zNw86IOpi7BeF2xLs3VTviZOQhymHWVZ4uLhAwBTCxg.jar
    Sep 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-kw3QtIKJBY-B0crw7JtI5muTxlMiDal-9Va-X2tA9vY.jar
    Sep 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-npZE5ffDe-UZpgYkUL1ypv-L8ZQC8ife2AlubGIJJd4.jar
    Sep 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-TEjAmnDl4JUMAFyZ-2ZsK_Plj-K3XPxnBTFWqa0Akls.jar
    Sep 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-59sgTLv1FpAR5LU8mPQLfvWQXXRLNxzsLQcvjAVJkKU.jar
    Sep 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-8lOaUZ8qEifHhQhzocuMrEtNqBtbdR7o4CIi5Gi6kKI.jar
    Sep 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-poH27Z0Btqcy-7gmNT4bs2yucL90xoaWIVX99ZaVhB8.jar
    Sep 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-VrN8-u0wTtVFw5oLC9w-xzMO0rJ1BaPBzU0Gs8HfgLM.jar
    Sep 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tX0mukQ_WVA2k1P2-IUZe06mpTLTXHz8E9TOmjPTtfs.jar
    Sep 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-59sgTLv1FpAR5LU8mPQLfvWQXXRLNxzsLQcvjAVJkKU.jar
    Sep 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-mPV6HPfEkJue7HIPOnl2RScjh-HPael2ABM0WDHdAI8.jar
    Sep 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-MyCtXh1Zoh2dkZRYkJnJ4cZVCbpsHnkOofOr0pgVNm0.jar
    Sep 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-qhar9h3ENYJxn6sJPE9YYzfCFyshkOiN2QOzO5GWNFM.jar
    Sep 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-2vph_3W-ZB1wdcoiom2FW_vxqZE6dtPqfjGbXODvEmQ.jar
    Sep 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-2kymy4nG1sUrigstw9u2Q4vytwVHJ2rkMr2JOH_FN8I.jar
    Sep 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-dGstqqSNR6QbFHUgPdxRGEl-iban9BqxUQKXJNGZ88U.jar
    Sep 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-oFwZ_eElRFjLSrJ_IUWNop_HU41a2LDYmqple6UAofU.jar
    Sep 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-t6E3pqN36TCkdRJ3PWMmtOYda1A_PaLQfPJEt8cBfvE.jar
    Sep 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-FaCkTV9QL8g4QSG7fgLnb2t80oDrcxkvLIYwWfbPI8c.jar
    Sep 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-GIddA4cFJ1fcTjIyLM7Ue6rUnoraan8KOvjH-c36Lmo.jar
    Sep 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-9WIgyOqxXL2ETM4SvKIS3JkZeNWCiJiFDWN_uhtVrWM.jar
    Sep 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-59iIkDUdnkT8Ygg1YLAFawtqd4ypWBa4mmrI9Gx51vU.jar
    Sep 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-XC76sFBLgNoWsiTaj9L2ZQZ-5GbOv0M7mLq6zUeGSCk.jar
    Sep 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-v-ilty99BFTM7uC6sATiXj1YQFFC3Bk_strnS0ZYBKc.jar
    Sep 25, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 194 files cached, 26 files newly uploaded in 0 seconds
    Sep 25, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 25, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 25, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 25, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 25, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 25, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95914 bytes, hash 4ebdcbe622e77920de0dd3a91d4224ac4d38c400266cef855295de0b1a232227> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Tr3L5iLneSDeDdOpHUIkrE04xAAmbO-FUpXeCxojIic.pb
    Sep 25, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Sep 25, 2020 6:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-25_11_45_20-2334548006027065858?project=apache-beam-testing
    Sep 25, 2020 6:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-25_11_45_20-2334548006027065858
    Sep 25, 2020 6:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-25_11_45_20-2334548006027065858
    Sep 25, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-25T18:45:20.864Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 25, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T18:45:29.439Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 25, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T18:45:30.775Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 25, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T18:45:30.954Z: Expanding GroupByKey operations into optimizable parts.
    Sep 25, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T18:45:31.030Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 25, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T18:45:31.138Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 25, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T18:45:31.191Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 25, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T18:45:31.246Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 25, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T18:45:31.300Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 25, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T18:45:31.936Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 25, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T18:45:32.047Z: Starting 5 workers in us-central1-b...
    Sep 25, 2020 6:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T18:45:46.242Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 25, 2020 6:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T18:45:56.835Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 25, 2020 6:46:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T18:46:14.619Z: Workers have started successfully.
    Sep 25, 2020 6:46:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T18:46:14.663Z: Workers have started successfully.
    Sep 25, 2020 6:46:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T18:46:57.349Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 25, 2020 6:46:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T18:46:57.535Z: Cleaning up.
    Sep 25, 2020 6:46:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T18:46:57.645Z: Stopping worker pool...
    Sep 25, 2020 6:47:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T18:47:50.002Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 25, 2020 6:47:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T18:47:50.061Z: Worker pool stopped.
    Sep 25, 2020 6:47:58 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-25_11_45_20-2334548006027065858 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): c9c6edab-67c8-4a3f-9e8f-8125dd2bc494 and timestamp: 2020-09-25T18:47:58.911000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    23.286

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 25, 2020 6:47:59 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.031 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 2 mins 51.584 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
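
When iterating on these failures, the failing task named in the report can be rerun on its own. The following invocation is illustrative only (it assumes the module's integrationTest task accepts Gradle's standard --tests filter, and it omits the -DintegrationTestPipelineOptions / runner properties that this CI job supplies):

    ./gradlew :sdks:java:extensions:sql:perf-tests:integrationTest --tests "org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT" --stacktrace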

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 42s
107 actionable tasks: 62 executed, 45 from cache

Publishing build scan...
https://gradle.com/s/zzdpyf4jbf2ic

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1038

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1038/display/redirect?page=changes>

Changes:

[noreply] Change when streaming Combine on Flink should be fired (#12931)

[Ismaël Mejía] [BEAM-10759] Uses reader Avro schema to deserialize in KafkaIO


------------------------------------------
[...truncated 273.46 KB...]
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 25, 2020 12:45:35 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 25, 2020 12:45:35 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 25, 2020 12:45:35 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 25, 2020 12:45:35 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 25, 2020 12:45:35 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 25, 2020 12:45:35 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 25, 2020 12:45:35 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

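The plans above are produced when the integration test hands this query to the Beam SQL planner; the test itself converts the resulting rel node via BeamSqlRelUtils.toPCollection, as the stack traces in this log show. For orientation, here is a minimal sketch of the equivalent public-API route (SqlTransform over an in-memory PCollection<Row>); the schema, rows, and class name below are invented for illustration and are not the test's HACKER_NEWS table:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class SqlTransformSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        // Stand-in schema with only the columns the query touches.
        Schema schema = Schema.builder()
            .addNullableField("by", Schema.FieldType.STRING)
            .addNullableField("type", Schema.FieldType.STRING)
            .addNullableField("title", Schema.FieldType.STRING)
            .addNullableField("score", Schema.FieldType.INT64)
            .build();

        PCollection<Row> input = p.apply(
            Create.of(Row.withSchema(schema)
                    .addValues("alice", "story", "Some title", 5L)
                    .build())
                .withRowSchema(schema));

        // PCOLLECTION is SqlTransform's implicit name for a single unnamed input.
        PCollection<Row> filtered = input.apply(SqlTransform.query(
            "SELECT `by` AS `author`, `type`, `title`, `score` "
                + "FROM PCOLLECTION "
                + "WHERE (`type` = 'story' OR `type` = 'job') AND `score` > 2"));

        p.run().waitUntilFinish();
      }
    }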

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

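Both failures above are the same construction-time problem: the output of ParDo(RowMonitor) is a PCollection of Beam Rows, and a Row coder cannot be inferred without a schema. The exception text names the remedy, PCollection.setRowSchema. A minimal, self-contained sketch of that remedy follows; the DoFn, schema, and field names are invented for illustration and are not the test's RowMonitor:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class SetRowSchemaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        Schema schema = Schema.builder()
            .addStringField("author")
            .addStringField("type")
            .addStringField("title")
            .addInt64Field("score")
            .build();

        PCollection<Row> rows = p
            .apply(Create.of("seed"))
            .apply(ParDo.of(new DoFn<String, Row>() {
              @ProcessElement
              public void processElement(@Element String ignored, OutputReceiver<Row> out) {
                out.output(Row.withSchema(schema)
                    .addValues("alice", "story", "Some title", 3L)
                    .build());
              }
            }))
            // Without this call, coder inference fails at pipeline construction with
            // the same "Unable to return a default Coder ... setRowSchema"
            // IllegalStateException reported above.
            .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }
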
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 25, 2020 12:45:35 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 25, 2020 12:45:35 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 25, 2020 12:45:35 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 25, 2020 12:45:35 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 25, 2020 12:45:35 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 25, 2020 12:45:35 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 25, 2020 12:45:35 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 25, 2020 12:45:35 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
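
The BeamPushDownIOSourceRel plan and the "Pushing down the following filter" line above show what DIRECT_READ push-down buys: only the four referenced columns are requested, and the printed predicate is handed to the BigQuery Storage read API instead of being evaluated inside the pipeline. Outside Beam SQL, a roughly equivalent hand-written read looks like the sketch below; the table reference and class name are placeholders, while the projection and row restriction mirror the pushed-down filter:

    import com.google.api.services.bigquery.model.TableRow;
    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
    import org.apache.beam.sdk.values.PCollection;

    public class DirectReadSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        PCollection<TableRow> rows = p.apply(
            BigQueryIO.readTableRows()
                .from("some-project:some_dataset.hacker_news")  // placeholder table
                .withMethod(Method.DIRECT_READ)
                // Column projection: only the fields the query selects are transferred.
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                // Server-side filter, equivalent to the pushed-down predicate above.
                .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }
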
    Sep 25, 2020 12:45:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 25, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 25, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-8JnFHP8oFwlt_LJFot2yxAETnsTPktPZ_Xmt1967uA8.jar
    Sep 25, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-Wa31o9CyBamZ24K5YU8C0xO4w-o4YEiOfXndOSzRoW4.jar
    Sep 25, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2292524598700075263.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-YS-j4eGwzi_tIRebvZj3a7-8ij1jQ6xZ2IVOS15IX6g.jar
    Sep 25, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-UjjvkMsF5n8UiNJ57I8-QvnvNJbzJvOzzFealA5OyxU.jar
    Sep 25, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-T5nbIHVcYSKxvvo_rQbfJlXWbDq_Iq7SAWfL1vbeYu4.jar
    Sep 25, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-9RmGtv4glmz14hs9DvqLLAfvaOztlfxU76B6xUACvM4.jar
    Sep 25, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-aCnTRNm1uTWA-V7k8YL9jYXNH9GKt1g3bk7in0ZWadY.jar
    Sep 25, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-fxnB8qT5mDxAl1dqA-d3MRQTaPAvS-WGpR0mY8ZkmUM.jar
    Sep 25, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-c6JZn8Wrrm_Y9spUEJigaF2DEDPYMwntdgqz3iZTDR0.jar
    Sep 25, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-31akpKo2TW0Cm_6GqPI69rGEVlRxEaPwdM1_IZCy6FM.jar
    Sep 25, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-rK6SV9n0CQf-OcDUjMRMsVWPJdXBYjQNK8HT84V_6SI.jar
    Sep 25, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-8JnFHP8oFwlt_LJFot2yxAETnsTPktPZ_Xmt1967uA8.jar
    Sep 25, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-U81NL-SV6Ydr15FxU3rYp-bWXhkKEb9cGarfEdhmmT4.jar
    Sep 25, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-Nii5dPPKVpgZj2lvYsy2vN4XOx5zaysqg3XpNTxc4ko.jar
    Sep 25, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-0Yn0_zHTkYapPxW7TZeCvOr7HSp-pPAb1K33mstiUtE.jar
    Sep 25, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-PGWIVluDZvHpsrjIjV2ksopNinrZEVm5bj0mFBvjUkc.jar
    Sep 25, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-Lei8q-w2a3JY32nCCgVGMhfTC55HIA81KwZjAr9Olm0.jar
    Sep 25, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-mxDyhX6RF3C3nlRx50cyoZ6LfNeNnRT8XM4Pm1Nak-I.jar
    Sep 25, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-qTCAvghx5qRyVOmYld2dXrpmFOjio2PT3z2TJsq3K5s.jar
    Sep 25, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-EtxFlMYmzl5Wmvub_a33FslWdG2wuWu-e9rV9neu77w.jar
    Sep 25, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-QkQDKq8eCAMpinU9EOd6UwIVXeSIz2geke4N4x05Q0Y.jar
    Sep 25, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-ijY7AO9o4emFhq3VkHLAUe1zT6xwWJBiv77k24zL-J0.jar
    Sep 25, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-zBPtQahBhWypgeaDV0OsEdsUtvTziBkPUvWlGOenNEA.jar
    Sep 25, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-rz6j9t60DX8uISgdSp3yw6rxv81OxjUnRYY0sI2LYvY.jar
    Sep 25, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-tenTa5wxh-hrRqqNREYR9PmPnHMwiKo4OAjpSltD1kw.jar
    Sep 25, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-qQ4lUA8SOV04bd7kdUnxOADmo76JzXGzC7tX8aNQvWw.jar
    Sep 25, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-ExR9PvUkKcwO1LDaHbknol3TyemYE6C-H0EerUUuFnQ.jar
    Sep 25, 2020 12:45:40 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 194 files cached, 26 files newly uploaded in 0 seconds
    Sep 25, 2020 12:45:40 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 25, 2020 12:45:40 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 25, 2020 12:45:40 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 25, 2020 12:45:40 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 25, 2020 12:45:40 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 25, 2020 12:45:40 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95916 bytes, hash b606bad74f67b3c3f1b00d966f103a73e71da95198a8dca2bb4e44ee3ace2780> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-tga6109ns8PxsA2WbxA6c-cdqVGYqNyiu05E7jrOJ4A.pb
    Sep 25, 2020 12:45:40 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Sep 25, 2020 12:45:41 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-25_05_45_40-4874489163137861910?project=apache-beam-testing
    Sep 25, 2020 12:45:41 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-25_05_45_40-4874489163137861910
    Sep 25, 2020 12:45:41 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-25_05_45_40-4874489163137861910
    Sep 25, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-25T12:45:40.797Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 25, 2020 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T12:45:47.955Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 25, 2020 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T12:45:48.777Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 25, 2020 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T12:45:48.827Z: Expanding GroupByKey operations into optimizable parts.
    Sep 25, 2020 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T12:45:48.861Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 25, 2020 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T12:45:48.940Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 25, 2020 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T12:45:49.033Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 25, 2020 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T12:45:49.070Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 25, 2020 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T12:45:49.102Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 25, 2020 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T12:45:49.447Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 25, 2020 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T12:45:49.525Z: Starting 5 workers in us-central1-b...
    Sep 25, 2020 12:46:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T12:46:00.559Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 25, 2020 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T12:46:18.351Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 25, 2020 12:46:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T12:46:35.581Z: Workers have started successfully.
    Sep 25, 2020 12:46:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T12:46:35.615Z: Workers have started successfully.
    Sep 25, 2020 12:47:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T12:47:08.495Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 25, 2020 12:47:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T12:47:08.656Z: Cleaning up.
    Sep 25, 2020 12:47:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T12:47:08.743Z: Stopping worker pool...
    Sep 25, 2020 12:48:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T12:48:02.610Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 25, 2020 12:48:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T12:48:02.656Z: Worker pool stopped.
    Sep 25, 2020 12:48:17 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-25_05_45_40-4874489163137861910 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 40b209dc-7110-4488-aae7-b1bc92ce7f25 and timestamp: 2020-09-25T12:48:17.525000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    13.857

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 25, 2020 12:48:18 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 2 mins 50.659 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 1s
107 actionable tasks: 65 executed, 42 from cache

Publishing build scan...
https://gradle.com/s/jkb2dh66hhojk

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1037

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1037/display/redirect?page=changes>

Changes:

[noreply] [BEAM-10960] Fix DropFields does not maintain the original fields order

[noreply] [BEAM-6103] Enable BQ streaming insert timeouts (#12893)


------------------------------------------
[...truncated 273.51 KB...]
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 25, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 25, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 25, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 25, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 25, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 25, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 25, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 25, 2020 6:45:33 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 25, 2020 6:45:33 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 25, 2020 6:45:33 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 25, 2020 6:45:33 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 25, 2020 6:45:33 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 25, 2020 6:45:33 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 25, 2020 6:45:33 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 25, 2020 6:45:33 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 25, 2020 6:45:34 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 25, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 25, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-4awc4viSGQj6vDkDmtGxg717QPCKYEco98KVX7rNThk.jar
    Sep 25, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test6501801575398705674.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-K2GVQ15aO51MIvn9fD0nS7eXkBbOFB0yr2w3PNjykHg.jar
    Sep 25, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-QC1zp6oVCD3ibRCR5UMZlcSNtGHakD5pxm34GK73FMI.jar
    Sep 25, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-t8kXh3TPlXP-gQJnUI84uAMwCKMJ8u_cPl7RH4BJp3Q.jar
    Sep 25, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-8u-kTOWsnAS-EvK1K8JRkA3xPPMSqSpH5-t5Bv7MwYE.jar
    Sep 25, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-hXuEhVV9pdbwPr4YACa-6VcVtubmuQ2rZfGiyyGTXaE.jar
    Sep 25, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-dTFthZ50bUj0a0fQNLgmYE6j_le3kKkNgkG0LBSmh4s.jar
    Sep 25, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-JFNZeNPh2d089zyXejW4YPj7HO9tz9zQnOJcNPubGhA.jar
    Sep 25, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-zN72JffqJUH9rG2QkM_COspXLDCIAZTrCKX0f5KAb9Y.jar
    Sep 25, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-7ZqTaUtKiAy_c5G1k09TlA6KW4bnDats129oQDJdks4.jar
    Sep 25, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-Mdle9oMO0Bne4JosjCS5Da4lGWFcC7n2nXINs12e6iQ.jar
    Sep 25, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-ppaCylyo6zuMiAOio60nUZ9KyOrlLGpWJ12aBthrQnA.jar
    Sep 25, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-4E2hCnmTkCXy7n9ZkCnzLUaw-qZcox3Wv-OZZjEl9wU.jar
    Sep 25, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-WwhtnYbU-90HLM4ifHs9cFiHEXhmXEEHc4pa9_oVCSg.jar
    Sep 25, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-OlD5oEq_xxBfJec8hpooLrt2ANJRXPSFKhmyBPr0mW4.jar
    Sep 25, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-50EVHwjcQXlfMdvUu2cQUV5lIQTKl-fEuZTt5u1KoD0.jar
    Sep 25, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-J8QZItIVvcOnoLyWyzYc5y7HwDyuNzLbJf6vTQSlSuM.jar
    Sep 25, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-t8kXh3TPlXP-gQJnUI84uAMwCKMJ8u_cPl7RH4BJp3Q.jar
    Sep 25, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-Qs3QkSu-V0qaCrSi7s6n2KIZMGDeTmyqeN63i0alTAg.jar
    Sep 25, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-DB_3psvC9vzp0mRrOASHKHucpXKmx0rNFQSb0mwZGyY.jar
    Sep 25, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-NMY6NoDERz6eQrsbbLvGAx50SU86s113xbv9Cz6oCTA.jar
    Sep 25, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-inZbsR1D2W_xYdTOqnWFRr0sGclcAu3tvNxFnNFPKjg.jar
    Sep 25, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-PfPo0XhEVEgSjs7y0UkarHiM9LcfPnt43PV1fsBYikE.jar
    Sep 25, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-C0jpYq4O_JKPNp-TQAuyiJ9kpZYKBzgYejVtEx1xhG4.jar
    Sep 25, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-hidlS6As_Lu-bunUjVKBcBKxOvenGbjBltbzJoESb2E.jar
    Sep 25, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-aOfJpiyGMtb4cpSFza-I9bZQW3vNDqVFSWV_GSgx1aw.jar
    Sep 25, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-mH865L8aFLd3XnoUfqLu8saFjLj1sh5z6xi2iPgqaDE.jar
    Sep 25, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 194 files cached, 26 files newly uploaded in 0 seconds
    Sep 25, 2020 6:45:37 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 25, 2020 6:45:37 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 25, 2020 6:45:37 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 25, 2020 6:45:37 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 25, 2020 6:45:37 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 25, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95916 bytes, hash 84ae51a8899a7f93578ad47bd3b511094b69dfd646391522662bdbb03cfe0bc1> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-hK5RqImaf5NXitR707URCUtp39ZGORUiZivbsDz-C8E.pb
    Sep 25, 2020 6:45:38 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Sep 25, 2020 6:45:39 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-24_23_45_38-11740137979377362187?project=apache-beam-testing
    Sep 25, 2020 6:45:39 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-24_23_45_38-11740137979377362187
    Sep 25, 2020 6:45:39 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-24_23_45_38-11740137979377362187
    Sep 25, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-25T06:45:38.228Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 25, 2020 6:45:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T06:45:47.151Z: Worker configuration: n1-standard-1 in us-central1-f.
    Sep 25, 2020 6:45:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T06:45:47.769Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 25, 2020 6:45:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T06:45:47.856Z: Expanding GroupByKey operations into optimizable parts.
    Sep 25, 2020 6:45:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T06:45:47.883Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 25, 2020 6:45:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T06:45:47.987Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 25, 2020 6:45:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T06:45:48.023Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 25, 2020 6:45:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T06:45:48.061Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 25, 2020 6:45:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T06:45:48.142Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 25, 2020 6:45:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T06:45:48.713Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 25, 2020 6:45:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T06:45:48.786Z: Starting 5 workers in us-central1-f...
    Sep 25, 2020 6:45:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T06:45:56.159Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 25, 2020 6:46:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T06:46:19.346Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 25, 2020 6:46:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T06:46:39.131Z: Workers have started successfully.
    Sep 25, 2020 6:46:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T06:46:39.159Z: Workers have started successfully.
    Sep 25, 2020 6:47:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T06:47:12.696Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 25, 2020 6:47:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T06:47:12.976Z: Cleaning up.
    Sep 25, 2020 6:47:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T06:47:13.118Z: Stopping worker pool...
    Sep 25, 2020 6:48:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T06:48:04.996Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 25, 2020 6:48:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T06:48:05.064Z: Worker pool stopped.
    Sep 25, 2020 6:48:13 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-24_23_45_38-11740137979377362187 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 8a6e7baa-8aa4-4aa6-b371-db07eb2d86c2 and timestamp: 2020-09-25T06:48:13.285000000Z:
                     Metric:                    Value:
                   read_time                    14.507
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 25, 2020 6:48:13 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 2 mins 48.698 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 56s
107 actionable tasks: 64 executed, 43 from cache

Publishing build scan...
https://gradle.com/s/akns33g7bzc5u

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1036

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1036/display/redirect?page=changes>

Changes:

[milan.cermak] Adds unit tests for the parse_table_reference function

[noreply] Merge pull request #12918 from [BEAM-10910]: Validate the BigQuery table

[noreply] [Minor] Typo/grammatical changes (#12849)


------------------------------------------
[...truncated 294.50 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 25, 2020 12:53:07 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 25, 2020 12:53:07 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 25, 2020 12:53:07 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 25, 2020 12:53:07 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 25, 2020 12:53:07 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 25, 2020 12:53:07 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 25, 2020 12:53:08 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
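
The stack trace above also names the remedy: the Row output of ParDo(RowMonitor) needs a schema (or an explicit coder) before the pipeline can be finalized. The sketch below is not taken from BigQueryIOPushDownIT; it only illustrates what attaching a schema via PCollection.setRowSchema could look like, assuming fields that mirror the columns selected by the test query (the class and method names are made up for the example).

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    final class RowSchemaFix {
      // Attach a schema so the CoderRegistry can supply a coder for the Rows,
      // as suggested by "provide a schema instead using PCollection.setRowSchema".
      static PCollection<Row> withSchema(PCollection<Row> rowMonitorOutput) {
        Schema schema = Schema.builder()
            .addStringField("author")   // HACKER_NEWS.by
            .addStringField("type")
            .addStringField("title")
            .addInt64Field("score")     // assumed INT64, matching BigQuery INTEGER
            .build();
        return rowMonitorOutput.setRowSchema(schema);
      }
    }

Calling setCoder(RowCoder.of(schema)) on the same PCollection would have an equivalent effect, per the first root cause listed in the trace.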

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 25, 2020 12:53:09 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 25, 2020 12:53:09 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 25, 2020 12:53:09 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 25, 2020 12:53:09 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 25, 2020 12:53:09 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 25, 2020 12:53:09 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 25, 2020 12:53:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 25, 2020 12:53:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
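
At the BigQueryIO level, the pushed-down read logged above corresponds roughly to a DIRECT_READ with selected fields and a row restriction. The sketch below is only an illustration of that mapping, not the test's actual code; the project, dataset, and table spec are placeholders.

    import java.util.Arrays;
    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.values.PCollection;

    public class PushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
        // Read only the used fields and push the filter to the BigQuery Storage API,
        // mirroring usedFields and BigQueryFilter in the BEAMPlan above.
        PCollection<TableRow> rows =
            p.apply(
                BigQueryIO.readTableRows()
                    .from("my-project:my_dataset.HACKER_NEWS")   // placeholder table spec
                    .withMethod(Method.DIRECT_READ)
                    .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                    .withRowRestriction(
                        "(type = 'story' OR type = 'job') AND score > 2"));
        p.run().waitUntilFinish();
      }
    }
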
    Sep 25, 2020 12:53:13 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 25, 2020 12:53:22 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 25, 2020 12:53:23 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT--A7ooheg7dDDB_Whh1cloPDao7O1tkWEox1IysFraDY.jar
    Sep 25, 2020 12:53:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-2tbebiE0_vFtkYBUqDJelgK72nccy1qJFZuOOOSS5H8.jar
    Sep 25, 2020 12:53:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-bbnIN6iG9BRQ9svTYvgb-J7FmOLuRwqImsIhboZ2FMk.jar
    Sep 25, 2020 12:53:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-BjsLxu9Kr4mgUZleFd_JeOBFc7wCXG7BxYmQc10RQTg.jar
    Sep 25, 2020 12:53:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-Zoemnav-llwPY2xo3Tbwm0i_ew-5TLvQpVUeQbqh6Pw.jar
    Sep 25, 2020 12:53:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-DQraQZmLN1YnuB8AM6tOt5J10ovIJ6SwdkzQK2h_GOg.jar
    Sep 25, 2020 12:53:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-JxrDDDUNK2AggHnjylAz65_O2fNp6Snthok2Tr5lv2w.jar
    Sep 25, 2020 12:53:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-iSoB1TbxWoCcjAXffnVAIJp-uw-UUuTLpvDNOuu1q2I.jar
    Sep 25, 2020 12:53:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-NvxPRYfqAcIVyrtyHy8vaOb6ntdFmDc6E-lR1DDf31o.jar
    Sep 25, 2020 12:53:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-LLdacHRx18nvU_deCA4mwHwld34beqVrPK5X840Azsk.jar
    Sep 25, 2020 12:53:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-wl9hfF5NSU1AiI74JdTAo_HkSONS_VStbWhb5VH463Q.jar
    Sep 25, 2020 12:53:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-BZdSUGfhbYYdzLd9OSLadHalgguONUC9wYESsf67H80.jar
    Sep 25, 2020 12:53:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-rnAD-8eCmlhHXkgcCvhGal4TSZqgQhT9OXbwWjpXibM.jar
    Sep 25, 2020 12:53:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-LSn490j5MzLXYuujMwJPDhebhdxwCUL2Hym_8ZFxXyk.jar
    Sep 25, 2020 12:53:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-0fMY0M9_robLU45AhpZ33ZzcJL7YSwg1QV2XJsXWXi0.jar
    Sep 25, 2020 12:53:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-Rt_p9MFwI89xWYU-1IsctLPwakxFHRbStpOuUt8yss0.jar
    Sep 25, 2020 12:53:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-LFaz_hhU6qL4DOB6QIJu1IudWYovAo4etTSohnCtzyU.jar
    Sep 25, 2020 12:53:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-uQusBpAIbMo_i-OXUVE5j-ZlnoGVeTaq29jjXi3E64Y.jar
    Sep 25, 2020 12:53:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-g_tC0BTMfGWoI0Dv0ruhD_nSOf03IeXtlgb-cGjTnNc.jar
    Sep 25, 2020 12:53:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-RKmHDiZZiNtxBIPsbniTeT7u_lG436il8ByVrI5OUq0.jar
    Sep 25, 2020 12:53:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-UTN-nnLuGdY7E0ZIFeehHX8JY04D7ZIeJj2D7yb8aiM.jar
    Sep 25, 2020 12:53:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-iihFdtufQND_I9xaXadQD7eHnv2K8dEZWnDZl_dQgO4.jar
    Sep 25, 2020 12:53:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-co4nzRpL6AOCtx5dLxf-BJb77yLEAQ42-CV3bhOApb4.jar
    Sep 25, 2020 12:53:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT--A7ooheg7dDDB_Whh1cloPDao7O1tkWEox1IysFraDY.jar
    Sep 25, 2020 12:53:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-QsQNDfjLzyiGHezPuNOwnwyMNATmwvwB8yhWaVQFGgc.jar
    Sep 25, 2020 12:53:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-TSJ8uAJaVshFW5saUUPUp8FQp_GmqxkjgY8LkgS84eQ.jar
    Sep 25, 2020 12:53:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3920304788997613909.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-9QaFcvF2c9I6mtrcPfbUmAzgJp4WLW42qN3pp7-TkiY.jar
    Sep 25, 2020 12:53:26 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-GGkvTZeNyELMKKww-FjaPBY3uuJjnI2S7oMWxgrD_q8.jar
    Sep 25, 2020 12:53:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-BeGNvit7d9EdTfYHYktkn7rE8duZpVSJ7Vhd1DhCSvc.jar
    Sep 25, 2020 12:53:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-hzcpkkk-kkiMcfh8GMc7X6I5_uq6lLDcwN_jn335P5M.jar
    Sep 25, 2020 12:53:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-GlFi8hO7A0FKshT9-QJcn9gsghobCqebsvfjZmwDk2c.jar
    Sep 25, 2020 12:53:30 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 190 files cached, 30 files newly uploaded in 7 seconds
    Sep 25, 2020 12:53:30 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 25, 2020 12:53:31 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 25, 2020 12:53:31 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 25, 2020 12:53:31 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 25, 2020 12:53:31 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 25, 2020 12:53:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95916 bytes, hash 0b38c9101dcd5a72d262cbb1455ae70c1df381b7dcbf749721a65b0dbb470364> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-CzjJEB3NWnLSYsuxRVrnDB3zgbfcv3SXIaZbDbtHA2Q.pb
    Sep 25, 2020 12:53:32 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Sep 25, 2020 12:53:33 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-24_17_53_32-3328448307700009274?project=apache-beam-testing
    Sep 25, 2020 12:53:33 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-24_17_53_32-3328448307700009274
    Sep 25, 2020 12:53:33 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-24_17_53_32-3328448307700009274
    Sep 25, 2020 12:53:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-25T00:53:32.404Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 25, 2020 12:53:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T00:53:42.526Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 25, 2020 12:53:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T00:53:43.455Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 25, 2020 12:53:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T00:53:43.496Z: Expanding GroupByKey operations into optimizable parts.
    Sep 25, 2020 12:53:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T00:53:43.523Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 25, 2020 12:53:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T00:53:43.599Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 25, 2020 12:53:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T00:53:43.622Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 25, 2020 12:53:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T00:53:43.660Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 25, 2020 12:53:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T00:53:43.683Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 25, 2020 12:53:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T00:53:44.157Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 25, 2020 12:53:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T00:53:44.234Z: Starting 5 workers in us-central1-b...
    Sep 25, 2020 12:54:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T00:53:58.909Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 25, 2020 12:54:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T00:54:12.144Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 25, 2020 12:54:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T00:54:31.586Z: Workers have started successfully.
    Sep 25, 2020 12:54:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T00:54:31.621Z: Workers have started successfully.
    Sep 25, 2020 12:55:07 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T00:55:05.656Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 25, 2020 12:55:07 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T00:55:05.795Z: Cleaning up.
    Sep 25, 2020 12:55:07 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T00:55:05.861Z: Stopping worker pool...
    Sep 25, 2020 12:55:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T00:55:56.232Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 25, 2020 12:55:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T00:55:56.270Z: Worker pool stopped.
    Sep 25, 2020 12:56:06 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-24_17_53_32-3328448307700009274 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): c02f32e4-fcf0-46e7-91f5-561adda2e69e and timestamp: 2020-09-25T00:56:07.086000000Z:
                     Metric:                    Value:
                   read_time                      13.5
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 25, 2020 12:56:08 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 3 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.098 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.138 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 3 mins 43.007 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 11m 4s
107 actionable tasks: 66 executed, 41 from cache

Publishing build scan...
https://gradle.com/s/5yhkdexspccuk

Stopped 2 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1035

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1035/display/redirect?page=changes>

Changes:

[Chad Dombrova] [BEAM-7746] Add type checking to coders

[noreply] [BEAM-9154] Ensure Chicago Taxi Example is disabled on Jenkins (#12929)

[noreply] Add to_pcollection example to wordcount_dataframe (#12923)


------------------------------------------
[...truncated 269.14 KB...]
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 24, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 24, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 24, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 24, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 24, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 24, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 24, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 24, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 24, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 24, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 24, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 24, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 24, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 24, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 24, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 24, 2020 6:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 24, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 24, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-chMhFqMxMPZLTav2LsNvG-NNb0VCNC0_vBEfoTslyuI.jar
    Sep 24, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-chMhFqMxMPZLTav2LsNvG-NNb0VCNC0_vBEfoTslyuI.jar
    Sep 24, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-HJwRA5XbFuwgX3efzqwaGcQYGQkW6p15Np05y1_v2e4.jar
    Sep 24, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-DoUYwo-d6U_ip5Z97rpPyJmnzwxy_f4kq6jH-yYgcwQ.jar
    Sep 24, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-a3ZOG3KcWkqs9l1Sv2OG3SPngZKdF-I6XDu5c0kqNCg.jar
    Sep 24, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-OAhP3Xo-iOZNW-1a4XLLjr299ZNDoMCpftDzHvGddEo.jar
    Sep 24, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-YQB1lGe1tJtGn51EgfEl_UBtJNBLRTnnVUINjS8JISo.jar
    Sep 24, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-yRXcaSXaDZbrb3Km5HDX1XMp0U33qM1Sq3sOyP1oONU.jar
    Sep 24, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-7C0pJZER1nqmbWPsXNuYu3rDE_HAL-iZE6t3LTNwzaM.jar
    Sep 24, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-FOkdBrllTJ2xgvMWHISzwNUWaQcde4KaVp68x5pfKmw.jar
    Sep 24, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-gNDe8ZPQvMedpC_Hcb_9exyITXI9-GC0NC-_arnBEdM.jar
    Sep 24, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-7tFo4ss52ukXcWWSVOWwnyvIHpD8ozGQHPOth2-HL7A.jar
    Sep 24, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-Rz7vGBO2giWuGTivPAqN26ASnWMe9rND6RCG0dNG7p4.jar
    Sep 24, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7983317506118141826.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-zJ_N3sgPK3g7yjjtRNECYYuZt4AWh3a7068EkxnhdzM.jar
    Sep 24, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-kCwqakENimiFDXpyERoX33FALw0iLf2izcn0H15FkIs.jar
    Sep 24, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-KL-0ofa70-MML5HfNDmu0x_rGf-bk51VNYQa1NG5MSM.jar
    Sep 24, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-blhTgj2vI4Zk-P0gksh4QYD0GapXM4-AHiN8NJT9WVg.jar
    Sep 24, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-irc7V0eTkKsM4pC2rtzNgcfBIXrEk3qYwEGw1JvrEPk.jar
    Sep 24, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-VSlnGmHEVJ3gTxaOS7sYz6RblBeFX3LPk8Mff5Ltq4E.jar
    Sep 24, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-mcKIokvJU8D7TEPKXG-wO-YahRhJt0DFI6EavsPqhK8.jar
    Sep 24, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-VWW-CkM8oME53BdaDz1OTdDh_96VyeosebgvRaKkbn4.jar
    Sep 24, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-bIuqYkZUtcOpF_peWV3ZvA7XmOML11YnmKuA2yZs1zo.jar
    Sep 24, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-KZft39E0rrbZVFbgkc7uEFLmzC6KuDQ9FanxNLsV69Y.jar
    Sep 24, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-uQAsQOTwC8Y4wvfp6lNu466O-gw9NljzkhzMeHdqaKY.jar
    Sep 24, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-j8cb1bI_X4dEfFh09m1lZ4fkK4IIj7VUjydz-VmoEYE.jar
    Sep 24, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-1Imlu7vkCJ0F3KMvN8_5Sd3Q7ZdnPozO_567hR1Pih0.jar
    Sep 24, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-dh2gGTPz0pkzSyUS61yT4sNzZXSUZhr4IKS1BhvhDlE.jar
    Sep 24, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 194 files cached, 26 files newly uploaded in 0 seconds
    Sep 24, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 24, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 24, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 24, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 24, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 24, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95916 bytes, hash bf9ce98605db7f30fe3241f009482a460fa203a54c5fd777152e6d435169ef93> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-v5zphgXbfzD-MkHwCUgqRg-iA6VMX9d3FS5tQ1Fp75M.pb
    Sep 24, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Sep 24, 2020 6:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-24_11_45_20-10026403084809293690?project=apache-beam-testing
    Sep 24, 2020 6:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-24_11_45_20-10026403084809293690
    Sep 24, 2020 6:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-24_11_45_20-10026403084809293690
    Sep 24, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-24T18:45:20.692Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 24, 2020 6:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T18:45:28.252Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 24, 2020 6:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T18:45:28.890Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 24, 2020 6:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T18:45:28.936Z: Expanding GroupByKey operations into optimizable parts.
    Sep 24, 2020 6:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T18:45:28.971Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 24, 2020 6:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T18:45:29.112Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 24, 2020 6:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T18:45:29.136Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 24, 2020 6:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T18:45:29.163Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 24, 2020 6:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T18:45:29.200Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 24, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T18:45:29.569Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 24, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T18:45:29.629Z: Starting 5 workers in us-central1-b...
    Sep 24, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T18:45:45.256Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 24, 2020 6:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T18:45:52.974Z: Autoscaling: Raised the number of workers to 2 based on the rate of progress in the currently running stage(s).
    Sep 24, 2020 6:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T18:45:52.999Z: Resized worker pool to 2, though goal was 5.  This could be a quota issue.
    Sep 24, 2020 6:45:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T18:45:58.304Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 24, 2020 6:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T18:46:12.962Z: Workers have started successfully.
    Sep 24, 2020 6:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T18:46:12.995Z: Workers have started successfully.
    Sep 24, 2020 6:46:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T18:46:46.000Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 24, 2020 6:46:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T18:46:46.147Z: Cleaning up.
    Sep 24, 2020 6:46:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T18:46:46.221Z: Stopping worker pool...
    Sep 24, 2020 6:47:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T18:47:41.702Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 24, 2020 6:47:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T18:47:41.745Z: Worker pool stopped.
    Sep 24, 2020 6:47:52 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-24_11_45_20-10026403084809293690 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): e3e92647-7e7c-41b3-aa1c-6adf914542e3 and timestamp: 2020-09-24T18:47:53.051000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    13.366

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 24, 2020 6:47:53 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.045 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 5,5,main]) completed. Took 2 mins 46.229 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 36s
107 actionable tasks: 62 executed, 45 from cache

Publishing build scan...
https://gradle.com/s/gnrkd6742b7k4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1034

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1034/display/redirect>

Changes:


------------------------------------------
[...truncated 269.85 KB...]
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_3/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)
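
The IllegalStateException above names the fix itself: a PCollection of Beam Rows needs an explicit schema before coder inference can succeed. Below is a minimal, self-contained sketch (not the IT's actual code; the field names and types are assumed from the query's projected columns) of attaching one with PCollection.setRowSchema:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline pipeline = Pipeline.create();

        // Schema assumed to mirror the query's projection: author, type, title, score.
        final Schema schema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt32Field("score")
                .build();

        PCollection<Row> rows =
            pipeline
                .apply(Create.of("seed"))
                .apply(
                    ParDo.of(
                        new DoFn<String, Row>() {
                          @ProcessElement
                          public void processElement(ProcessContext c) {
                            c.output(
                                Row.withSchema(schema)
                                    .addValues("alice", "story", "hello", 3)
                                    .build());
                          }
                        }))
                // Without this call, pipeline construction fails with the same
                // "Unable to return a default Coder ... setRowSchema" error as above.
                // An equivalent alternative is setCoder(RowCoder.of(schema)).
                .setRowSchema(schema);

        pipeline.run().waitUntilFinish();
      }
    }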

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 24, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 24, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 24, 2020 12:45:20 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 24, 2020 12:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 24, 2020 12:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 24, 2020 12:45:20 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 24, 2020 12:45:20 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 24, 2020 12:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 24, 2020 12:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 24, 2020 12:45:20 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 24, 2020 12:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 24, 2020 12:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 24, 2020 12:45:20 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 24, 2020 12:45:20 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 24, 2020 12:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
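
For context, the plan and pushed-down filter above mean the BigQuery Storage Read API is asked for only the projected columns ("usedFields" in the plan) and only the rows matching the WHERE clause. A rough stand-alone sketch of the same read expressed directly with BigQueryIO follows; the table spec is a placeholder, not the table this test actually reads:

    import java.util.Arrays;
    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
    import org.apache.beam.sdk.values.PCollection;

    public class DirectReadFilterSketch {
      public static void main(String[] args) {
        Pipeline pipeline = Pipeline.create();

        PCollection<TableRow> rows =
            pipeline.apply(
                BigQueryIO.readTableRows()
                    // Placeholder table spec; the IT resolves its table through the
                    // Beam SQL BigQuery table provider instead.
                    .from("my-project:my_dataset.HACKER_NEWS")
                    .withMethod(Method.DIRECT_READ)
                    // Request only the columns the query uses...
                    .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                    // ...and let BigQuery evaluate the filter server-side,
                    // matching the push-down logged above.
                    .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

        pipeline.run().waitUntilFinish();
      }
    }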
    Sep 24, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 24, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 24, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-p3fLmjuguhQsTjx0JRMaq05JkyKKsfAMvj1-x81MrBA.jar
    Sep 24, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-Hid8dxTUQneyLAIBHlMduwmHtVlkA7gvFg15RnuY3Kk.jar
    Sep 24, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-96RiI30v5nb3Y_sbiXrbvHBfGSsIvexVpgrm_ErPK_A.jar
    Sep 24, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-x9leiPlg80HQltjz0eH7UEA8ZBDBcyNLj_N3e8_e8Qo.jar
    Sep 24, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-oHVoCPgMTmiUGFXFLWSNVt6NtSuXwKTovRsDka8W8qM.jar
    Sep 24, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-W4Rt3ZVzgBOzuQp1iVXXD4z6kZ5TMBbijiX96Pj1Uto.jar
    Sep 24, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-fgL5ATwPheAB7j70f4iLRCjLK3I7ZZBcUnkliA8s9OE.jar
    Sep 24, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-0RN0M66vTmT2mvK5vELAA5eWvA08hwtGnJ0PB2dJ21w.jar
    Sep 24, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-hh-Dan6wEp8pulTUCsaiQbvEex89E-podCxB2OboW-Y.jar
    Sep 24, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-386Lf1kumpbw8PtHy0TU5-G3blkEuaCkK3J_rpI0DDc.jar
    Sep 24, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-dICsQRdzzuZpzb_GdoNacpEUHCV_gXhBno9ODiEKgKI.jar
    Sep 24, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test9023185273535501287.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-xQgQASQnWY2fq_udZZiDq-s71YT6EiTWe2g4Gbf4WHA.jar
    Sep 24, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-p3fLmjuguhQsTjx0JRMaq05JkyKKsfAMvj1-x81MrBA.jar
    Sep 24, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-vpq40Ui7MsNpDhwL6Mw-KoaiuWf5Tp8pNbDSDnxe_nI.jar
    Sep 24, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-7EA2EpZDaHbwjIGr0tRrozn3g0JERnONsL7c8jB96BU.jar
    Sep 24, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-v-TGvC18fwZ8Tj5dmEPOPy0s_Ha66wDvHPY2VnalDO0.jar
    Sep 24, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-1-q_Yjgd4QLKha0Gfg_VU3lTG18yuSqGCPNIWUY7Rjo.jar
    Sep 24, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-VR9IGbx9PbkeN5_QI1b7qghOO04qUaLZlQS7Tjh7a0g.jar
    Sep 24, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-G3MyBfAjxEd_8sb5PUy_9b2_t6D0kImrP6x8b8dNM3o.jar
    Sep 24, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-IcmIqC4QRBEW209yFu2qZ58EftJ-_r2NO7fQNXikQ1Q.jar
    Sep 24, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-9Y0cEJP--xsLgxmnB_gRKvr7HHoPJIFDoYD4ZaZhfQk.jar
    Sep 24, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-ujrSkklbnqtoXPV2oVSv2iHwM5cAB7g-hTaBN0_Q_vw.jar
    Sep 24, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-yuZHYe_OmCOHCrv6VyqDZp4IOqHsch96reccCLqE0dQ.jar
    Sep 24, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-7EyO6y2KUvVgILqTXgYGPIbo3UVSLRk22yDtu7tGM2Q.jar
    Sep 24, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-mJSITdn5JoIZNIDrrsvTzYzLzv96wghavqaFH0EY2FY.jar
    Sep 24, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-2RJGda5xvSKFSEeSdL5Diy3Lue3b0YDbp4VjGuQJyBw.jar
    Sep 24, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-3t28CJDkVJvXrhww6SyBZwLFMZTRIJig_PyhtbOYoIk.jar
    Sep 24, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 194 files cached, 26 files newly uploaded in 1 seconds
    Sep 24, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 24, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 24, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 24, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 24, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 24, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95914 bytes, hash ebfdc9d8579a1d7dd80fecbd7c8cca83826d21eaa9aba67c2aff3e1c73b9900e> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-6_3J2FeaHX3YD-y9fIzKg4JtIeqpq6Z8Kv8-HHO5kA4.pb
    Sep 24, 2020 12:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Sep 24, 2020 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-24_05_45_25-4644725188128550442?project=apache-beam-testing
    Sep 24, 2020 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-24_05_45_25-4644725188128550442
    Sep 24, 2020 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-24_05_45_25-4644725188128550442
    Sep 24, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-24T12:45:25.231Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 24, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T12:45:33.153Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 24, 2020 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T12:45:33.928Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 24, 2020 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T12:45:33.955Z: Expanding GroupByKey operations into optimizable parts.
    Sep 24, 2020 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T12:45:33.984Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 24, 2020 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T12:45:34.046Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 24, 2020 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T12:45:34.075Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 24, 2020 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T12:45:34.109Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 24, 2020 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T12:45:34.140Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 24, 2020 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T12:45:34.540Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 24, 2020 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T12:45:34.628Z: Starting 5 workers in us-central1-b...
    Sep 24, 2020 12:45:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T12:45:46.979Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 24, 2020 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T12:45:59.934Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 24, 2020 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T12:46:18.669Z: Workers have started successfully.
    Sep 24, 2020 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T12:46:18.698Z: Workers have started successfully.
    Sep 24, 2020 12:46:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T12:46:49.105Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 24, 2020 12:46:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T12:46:49.310Z: Cleaning up.
    Sep 24, 2020 12:46:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T12:46:49.408Z: Stopping worker pool...
    Sep 24, 2020 12:47:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T12:47:40.306Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 24, 2020 12:47:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T12:47:40.353Z: Worker pool stopped.
    Sep 24, 2020 12:47:50 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-24_05_45_25-4644725188128550442 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 80850963-d7d3-4d29-aa4a-87a6c7086e13 and timestamp: 2020-09-24T12:47:50.686000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    10.913

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 24, 2020 12:47:51 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 2 mins 40.328 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 34s
107 actionable tasks: 62 executed, 45 from cache

Publishing build scan...
https://gradle.com/s/akwlc7di2jvzs

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1033

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1033/display/redirect?page=changes>

Changes:

[Robin Qiu] Moving to 2.26.0-SNAPSHOT on master branch.

[noreply] [BEAM-9616] Consolidate Element and DoFn json impl (#12925)


------------------------------------------
[...truncated 272.88 KB...]
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 24, 2020 6:45:28 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 24, 2020 6:45:28 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 24, 2020 6:45:28 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 24, 2020 6:45:28 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 24, 2020 6:45:28 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 24, 2020 6:45:28 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 24, 2020 6:45:28 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 24, 2020 6:45:29 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 24, 2020 6:45:29 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 24, 2020 6:45:29 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 24, 2020 6:45:29 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 24, 2020 6:45:29 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 24, 2020 6:45:29 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 24, 2020 6:45:29 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 24, 2020 6:45:29 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 24, 2020 6:45:30 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 24, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 24, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-ihebk17_TaRao07kB6MTMa0iWxvuQOwGrENFFG1dVjU.jar
    Sep 24, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-bd-UPIlu_vnP8gTSGpn0uD6QMeuq27opeNCWRVkL9yc.jar
    Sep 24, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-9Q9BVFqVrHKjfAcjmV7ZgKGwI84AMmZPVSbIUrmRtRo.jar
    Sep 24, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-VTTdJyXI1aGwiDRbdItkSAcvBJincyFIsxvYpqwNTOQ.jar
    Sep 24, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-CyJ3OwDEGwXomYY9ZRknvSJNO7UNAIx-cyTxt4-PIMk.jar
    Sep 24, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-SED3nJwEOZSU-iSS1PdvQiTp0EAcsZTYr30EK-zH558.jar
    Sep 24, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-ihebk17_TaRao07kB6MTMa0iWxvuQOwGrENFFG1dVjU.jar
    Sep 24, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-Va6dJtgowgJWZ_lE0Zyx-Mjrx3K1E-SY3SfkfwuVM-I.jar
    Sep 24, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-0acWQu3Hn7vOpkbuKt7Dq8jwXRhJ4ubDk0BAaynMQVA.jar
    Sep 24, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-cRXW5QzPhItAwovzyJzr5ZZlyiW7-qZXrCp7sNUEc6M.jar
    Sep 24, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-OW-SJL6H4BlsUhiPJP0Fz_ndneBmzbuNX1URJrApxUk.jar
    Sep 24, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-9k4fXxvF2mipXBvGjtdpN1FreBInL3f0MhUmCZdVTCk.jar
    Sep 24, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-QMniAQYzyc0Vz06XU-BzQ3vM4WfvPvt6sAgxAyKOxFA.jar
    Sep 24, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-Z4UXA6HOj5DHlNJg1yHR4UAozsIOe3jf9ybgK0gUg4s.jar
    Sep 24, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-wtHNrogSGaCPzX65IqTI7A-KRpgxekyogz6-JCC4kpM.jar
    Sep 24, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-V3q_-65sUBevoMjEilp0Zi-tktKveNhcMbC_pmqTL1k.jar
    Sep 24, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-7g2dEAj6RMwykzX2oqaUgMijamKIY87IVtYy-b862B4.jar
    Sep 24, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-UuhNRZi_sQyqMsgKtPaHPVHV8T7st4LriP85FXFs064.jar
    Sep 24, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-L1msJPOb-NtJO2kr9Hs7mVLAXW4yGEoL-ffoEi0Jycw.jar
    Sep 24, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-U4NKmU4i1LZL_T8l_RYs1EQPvMWVbfGdtAZu1fLfUVM.jar
    Sep 24, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-SdjygCR-eHpDxws2B4CLpv7ZRYW20F3aD01gam9t3PY.jar
    Sep 24, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-TZKqXuVWCoMdcQV1KUxMXkyDJn24k_exMzoVrk_mNKw.jar
    Sep 24, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-8V0FNdJf0aR5g4n8xnzyl21X0lYqKSqSVXalEsyxGNE.jar
    Sep 24, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-PjgsuR-Sh8deibg41mP_vYoCV8PgZ0Y9LKGcfz2UaCo.jar
    Sep 24, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-SpQ5f-MgjsxKXA6x6Ul8g1wEdgzKgWfw9TRjxs_yFQE.jar
    Sep 24, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test317891621245546609.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-zjkflGQA8lrSIZwTbwThX9adovRO2LwZbpTB-wgX6Ho.jar
    Sep 24, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-HJNJm2DgDo-jDjcXopC0CWJXet3pNtsu3ib7PhWPn-4.jar
    Sep 24, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 194 files cached, 26 files newly uploaded in 0 seconds
    Sep 24, 2020 6:45:33 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 24, 2020 6:45:33 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 24, 2020 6:45:33 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 24, 2020 6:45:33 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 24, 2020 6:45:33 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 24, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95913 bytes, hash 02e73f80f59c863dcf4e071daed4ca4211d43f0b84fd440eaef0ab15cbe309eb> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Auc_gPWchj3PTgcdrtTKQhHUPwuE_UQOrvCrFcvjCes.pb
    Sep 24, 2020 6:45:33 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Sep 24, 2020 6:45:34 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-23_23_45_33-4151554466615201999?project=apache-beam-testing
    Sep 24, 2020 6:45:34 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-23_23_45_33-4151554466615201999
    Sep 24, 2020 6:45:34 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-23_23_45_33-4151554466615201999
    Sep 24, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-24T06:45:33.661Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 24, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T06:45:46.033Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 24, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T06:45:47.314Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 24, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T06:45:47.359Z: Expanding GroupByKey operations into optimizable parts.
    Sep 24, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T06:45:47.389Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 24, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T06:45:47.575Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 24, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T06:45:47.870Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 24, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T06:45:47.973Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 24, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T06:45:48.159Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 24, 2020 6:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T06:45:49.566Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 24, 2020 6:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T06:45:49.666Z: Starting 5 workers in us-central1-b...
    Sep 24, 2020 6:46:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T06:45:58.465Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 24, 2020 6:46:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T06:46:16.761Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 24, 2020 6:46:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T06:46:37.350Z: Workers have started successfully.
    Sep 24, 2020 6:46:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T06:46:37.392Z: Workers have started successfully.
    Sep 24, 2020 6:47:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T06:47:08.575Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 24, 2020 6:47:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T06:47:09.213Z: Cleaning up.
    Sep 24, 2020 6:47:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T06:47:09.289Z: Stopping worker pool...
    Sep 24, 2020 6:48:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T06:48:44.361Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 24, 2020 6:48:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T06:48:44.405Z: Worker pool stopped.
    Sep 24, 2020 6:48:52 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-23_23_45_33-4151554466615201999 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): b6603a0c-638b-4044-bdaf-a5a6a9afae85 and timestamp: 2020-09-24T06:48:52.121000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     12.57

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 24, 2020 6:48:52 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.018 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':',5,main]) completed. Took 3 mins 31.261 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 35s
107 actionable tasks: 65 executed, 42 from cache

Publishing build scan...
https://gradle.com/s/azk2sf3iy5p7y

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1032

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1032/display/redirect?page=changes>

Changes:

[noreply] Revert "[BEAM-10861] Adds URNs and payloads to PubSub transforms to

[Valentyn Tymofieiev] [BEAM-9372][BEAM-7372] Remove Py2 and Py35 test suites.

[Valentyn Tymofieiev] [BEAM-9372][BEAM-8371] Sunset Python 2 and Python 3.5 support in Apache

[Valentyn Tymofieiev] [BEAM-9372][BEAM-7372] Clean release script and correct naming pattern

[srohde] Fix BEAM-10956

[noreply] [BEAM-10586] Remove Python 2.7 and Python 3.5 support in Dataflow

[noreply] Merge pull request #12912 from [BEAM-10938] Adds support for writing a

[sambvfx] [BEAM-8660] Use PortableOptions.artifact_endpoint if provided over

[sambvfx] Add artifact_endpoint to TestJobServicePlan

[sambvfx] Add simple test for artifact_endpoint

[noreply] [BEAM-9547] Roll forward #12858 (#12920)

[noreply] Merge pull request #12782 from [BEAM-10950] Overriding Dataflow Native

[noreply] [BEAM-10844] Add experiment option prebuild_sdk_container to prebuild


------------------------------------------
[...truncated 282.16 KB...]
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 24, 2020 12:45:42 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 24, 2020 12:45:42 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 24, 2020 12:45:42 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 24, 2020 12:45:42 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 24, 2020 12:45:42 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 24, 2020 12:45:42 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 24, 2020 12:45:42 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
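
    The exception above is the schema problem its own message points at: the RowMonitor ParDo emits Beam Rows, and no coder can be inferred for a Row until a schema is attached to the output PCollection. Below is a minimal, illustrative sketch of the kind of fix the message suggests (this is not the IT's actual code; the field names and types are assumptions read off the query above, and the DoFn is a stand-in for the monitoring transform):

        import org.apache.beam.sdk.Pipeline;
        import org.apache.beam.sdk.schemas.Schema;
        import org.apache.beam.sdk.transforms.Create;
        import org.apache.beam.sdk.transforms.DoFn;
        import org.apache.beam.sdk.transforms.ParDo;
        import org.apache.beam.sdk.values.PCollection;
        import org.apache.beam.sdk.values.Row;

        // Schema for the projected columns; field types are assumptions, not the IT's schema.
        Schema schema =
            Schema.builder()
                .addNullableField("author", Schema.FieldType.STRING)
                .addNullableField("type", Schema.FieldType.STRING)
                .addNullableField("title", Schema.FieldType.STRING)
                .addNullableField("score", Schema.FieldType.INT64)
                .build();

        Pipeline p = Pipeline.create();
        PCollection<Row> input =
            p.apply(
                Create.of(Row.withSchema(schema).addValues("someone", "story", "a title", 3L).build())
                    .withRowSchema(schema));

        PCollection<Row> monitored =
            input
                .apply(
                    "ParDo(RowMonitor)", // stand-in for the IT's monitoring DoFn
                    ParDo.of(
                        new DoFn<Row, Row>() {
                          @ProcessElement
                          public void processElement(@Element Row row, OutputReceiver<Row> out) {
                            out.output(row);
                          }
                        }))
                // Without this call, PCollection.getCoder() fails exactly as in the stack trace
                // above, because a Row output has no inferable coder until a schema is attached.
                .setRowSchema(schema);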

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 24, 2020 12:45:42 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 24, 2020 12:45:42 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 24, 2020 12:45:42 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 24, 2020 12:45:42 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 24, 2020 12:45:42 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 24, 2020 12:45:42 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 24, 2020 12:45:43 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 24, 2020 12:45:43 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
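    When the table's read method is DIRECT_READ, the planner turns the WHERE clause into the row restriction logged above and the usedFields list into selected fields for the BigQuery Storage read. As a self-contained illustration only (placeholder table spec, not the IT's code), the pushed-down read corresponds to a plain BigQueryIO read along these lines:

        import java.util.Arrays;

        import com.google.api.services.bigquery.model.TableRow;
        import org.apache.beam.sdk.Pipeline;
        import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
        import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
        import org.apache.beam.sdk.values.PCollection;

        Pipeline p = Pipeline.create();
        PCollection<TableRow> rows =
            p.apply(
                "Read HACKER_NEWS with push-down",
                BigQueryIO.readTableRows()
                    .from("some-project:some_dataset.hacker_news") // placeholder table spec
                    .withMethod(Method.DIRECT_READ)
                    // the projected columns from usedFields=[[by, type, title, score]]
                    .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                    // the predicate logged as "Pushing down the following filter"
                    .withRowRestriction("(`type` = 'story' OR `type` = 'job') AND `score` > 2"));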
    Sep 24, 2020 12:45:44 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 24, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 24, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-4XXipBPbqYkasCgTgYyyTFgLC4H5WTMJk9ILZI6E10I.jar
    Sep 24, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-PU21hyoB053NBjHcorM-7tGXVqVWc15OU0y1DmKszLY.jar
    Sep 24, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-jzGfkWmNjOQR1ExicMtkA1m1vfZFvwtU89CcyvIXmDk.jar
    Sep 24, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-3Xks6sO606B_hCoTHSGQ1f-oh3vW9T8qjCLLLCxcbjk.jar
    Sep 24, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-5hVGWqs6qiYrc4hfxjwtVLxAOznT8up4RUYjlFVLjGg.jar
    Sep 24, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-Ea5Ad82bf8RjS7JkoCgY-Atjynk1ClAE7izyqpGA2OA.jar
    Sep 24, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-6gBYtmvF2SSHSlEZ75exRPk0qgRLM6P4mxC3i63tqrU.jar
    Sep 24, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-kPNEg6UhDFi85NgL5SR6lr-AIvsiq16y0AJcg2qfYho.jar
    Sep 24, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-EBkvxq3zjlfQaJimBYzSOot79dyosiuUZnSEvS1kNrk.jar
    Sep 24, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-OSlIvyl8x9Sx6qlcvcl4fV1f780tzI2TCw27VZNQcc8.jar
    Sep 24, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-ipcthZIr1EcYbPTbM72jk9bzyJVngCJRnuTuZhXdWm0.jar
    Sep 24, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-3Vo6I6_5xNcWLHZj6o7cfOhnyIF051iuY6JOzSDyQkU.jar
    Sep 24, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-9ye2T-rD49YqzE9zsxPCUaukoDw0AIY8GjMJ1FNOCPA.jar
    Sep 24, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-XpC3WCdde3NxXVWlRnXN20ELwr0YhmRqGQ8GjS_FweM.jar
    Sep 24, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-RURAHSZXcwVbxnCKJdbkzPU96FAr7YL-g1fvy9zilik.jar
    Sep 24, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-4vN8y6u0Nrd_iUKnlglu3HwQss5EkDausS0DrEXcNRw.jar
    Sep 24, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-DcFRgTVK9wCKQUrBDmNo3UVvBqZy-Pp0G2u8rTr38Ds.jar
    Sep 24, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8574567814791560331.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-zchzAgee1ozf61jv4CQP4E5ayW7iP1MSRNpn5y8CuYY.jar
    Sep 24, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-igyMNAvjxh8vtrslKtGSylCxU-8rZGHaCdnQoFZsgPo.jar
    Sep 24, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-4XXipBPbqYkasCgTgYyyTFgLC4H5WTMJk9ILZI6E10I.jar
    Sep 24, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-vgNQeqZcpJ2qJzobze4IBbne5dHM-FWdIyYwV-WXGcI.jar
    Sep 24, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-PUiTdvBdYLrEWSNBOIL1EfvAwvC5BBmocelkdENSGuE.jar
    Sep 24, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-6T4_bFH9Z5APkTsxBFTrfjRYeRXOUVsu9vLnRNRuwbw.jar
    Sep 24, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-aGr0Gfxh9WDkCRCXZmCGT-DQcTnVnpqfzFo2Ap0M1GE.jar
    Sep 24, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-Ztf5O4IHJ6lghO4sE58u-lUDBdT4bxA92uTFtrgfE-8.jar
    Sep 24, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-GROsk9Ho2JGwBX5m4luL6k39PYV1FgiLXdMWIRGF7eI.jar
    Sep 24, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-ePkJiJstM-ypfNHINUcW1YsMzCYBhWCmQCmgqdtQibI.jar
    Sep 24, 2020 12:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 194 files cached, 26 files newly uploaded in 0 seconds
    Sep 24, 2020 12:45:48 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 24, 2020 12:45:48 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 24, 2020 12:45:48 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 24, 2020 12:45:48 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 24, 2020 12:45:48 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 24, 2020 12:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95916 bytes, hash 7ac8ff65a6dd3371ee4545dfa64d2f214306d087254c91d66f101a085a54a51a> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-esj_ZabdM3HuRUXfpk0vIUMG0IclTJHWbxAaCFpUpRo.pb
    Sep 24, 2020 12:45:48 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 24, 2020 12:45:50 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-23_17_45_48-17388493341644479826?project=apache-beam-testing
    Sep 24, 2020 12:45:50 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-23_17_45_48-17388493341644479826
    Sep 24, 2020 12:45:50 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-23_17_45_48-17388493341644479826
    Sep 24, 2020 12:45:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-24T00:45:48.815Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 24, 2020 12:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T00:45:55.722Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 24, 2020 12:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T00:45:56.362Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 24, 2020 12:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T00:45:56.404Z: Expanding GroupByKey operations into optimizable parts.
    Sep 24, 2020 12:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T00:45:56.438Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 24, 2020 12:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T00:45:56.514Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 24, 2020 12:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T00:45:56.546Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 24, 2020 12:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T00:45:56.579Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 24, 2020 12:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T00:45:56.666Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 24, 2020 12:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T00:45:57.037Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 24, 2020 12:45:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T00:45:57.126Z: Starting 5 workers in us-central1-b...
    Sep 24, 2020 12:46:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T00:46:12.681Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 24, 2020 12:46:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T00:46:21.439Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 24, 2020 12:46:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T00:46:44.159Z: Workers have started successfully.
    Sep 24, 2020 12:46:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T00:46:44.193Z: Workers have started successfully.
    Sep 24, 2020 12:47:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T00:47:19.837Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 24, 2020 12:47:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T00:47:20.002Z: Cleaning up.
    Sep 24, 2020 12:47:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T00:47:20.103Z: Stopping worker pool...
    Sep 24, 2020 12:48:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T00:48:12.065Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 24, 2020 12:48:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T00:48:12.108Z: Worker pool stopped.
    Sep 24, 2020 12:48:20 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-23_17_45_48-17388493341644479826 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 0e76bc9f-7bc7-4f39-a5df-e27b01e5f573 and timestamp: 2020-09-24T00:48:20.118000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    16.468

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR

Gradle Test Executor 4 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED
    Sep 24, 2020 12:48:20 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

3 tests completed, 2 failed
Finished generating test XML results (0.053 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.072 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 2,5,main]) completed. Took 2 mins 46.2 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 3s
107 actionable tasks: 69 executed, 38 from cache

Publishing build scan...
https://gradle.com/s/hl3fux3i47v2u

Stopped 3 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1031

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1031/display/redirect?page=changes>

Changes:

[srohde] Add the ib.recordings API

[srohde] Fix macos IB recordings test

[srohde] Delete RM reference from environment when evicted

[srohde] Move pipeline_var into RM constructor

[aromanenko.dev] [BEAM-2546] Update CHANGES.md and add some checks


------------------------------------------
[...truncated 273.62 KB...]
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 23, 2020 6:45:41 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 23, 2020 6:45:41 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 23, 2020 6:45:41 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 23, 2020 6:45:41 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 23, 2020 6:45:41 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 23, 2020 6:45:41 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 23, 2020 6:45:41 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 23, 2020 6:45:41 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 23, 2020 6:45:41 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 23, 2020 6:45:41 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 23, 2020 6:45:41 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 23, 2020 6:45:41 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 23, 2020 6:45:41 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 23, 2020 6:45:41 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 23, 2020 6:45:41 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 23, 2020 6:45:42 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 23, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 23, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-tgGg0H49ItA83tCtqME72oyUd9MFRqLimgf9yJCY5V4.jar
    Sep 23, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-ykpfgzSpm_ycK9xNl2aKbbKwASCg1gdZ4OX25HNadxM.jar
    Sep 23, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-TM-1yYXNW-XxhI4bHVq8Uu2dviAvCO2hHU-pu2UNKeE.jar
    Sep 23, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-T5hs_sxY5YJiH0VJEm0nNJ-LVTkMGIEJsOhi-0Ogagc.jar
    Sep 23, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-asGXDQ2WADVAkMTAvHflX3YCYHXabhDpryqllU4AT30.jar
    Sep 23, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-XMu_8wA4byr6jCRmmuyLPBH8z27GLDh4TyiuZKfY9bU.jar
    Sep 23, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-ylakBT9SnBZHCJPqzbgzFfX4RZp1kkWutSRXJuIueVM.jar
    Sep 23, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-q3bbbBbfxZbfg6GMVi7YGEUqg7pJ5z5JRAMcq_9BPxI.jar
    Sep 23, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-jUXYncsmM7g9zp5cxst7my3B6uwV_qZiAFfX3fNEjkE.jar
    Sep 23, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-ZXD8b1woDPrd_QeqoI7sXG9FdqYHP6m_Ze9K4Ghg_xc.jar
    Sep 23, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-mOEbUi8Ksum2f6-M-p5PFx8RKcOkrT6dK33SiOWF6_4.jar
    Sep 23, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-QCPZbwFngVcj6ofZVGKA94RFt2SrSsTdSP5nm0h9XyE.jar
    Sep 23, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-edfC4qdnH4Iukaxow9RLABQal20EVC0sfnXOSMWgmb0.jar
    Sep 23, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-tgGg0H49ItA83tCtqME72oyUd9MFRqLimgf9yJCY5V4.jar
    Sep 23, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-9kLD4KLD7FXBeOfbxKikd98LoC9uC20AiEwfbQ5Vbq8.jar
    Sep 23, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test6455771520532046824.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-JlLCWTKgFaqrc5Lwg3WG_J9L_XA6DaZb-_ti2ihNpVY.jar
    Sep 23, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-cCnOaXVMTPgnQbN1wt-j_wvdURTXUKqxVs-neiqL3Gg.jar
    Sep 23, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-iIwTdDONzLaMGAlE9ytnTzNLYDnX05_HIhoNVHhrDOI.jar
    Sep 23, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-P00sGi_Gn-L7FRi8zlMXJsx1j22AnM5r7iQxiK0Jm1o.jar
    Sep 23, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-RPmNqGGq9RmWTrt4UrrVNW5VWSSusdSP6uPZqhZT_YI.jar
    Sep 23, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-sGS49OO1zCNknXQoD7toIgOi5-3q78FlfK7IJMmF9ao.jar
    Sep 23, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-X_GIuX0-2ajOGn6P29egAAuvOWI7nUAX8xT4bHqLMPw.jar
    Sep 23, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-Jeh66QJ14e86zaHT7kQ-mlqnr5yGH1KWkfMDMmgDRcQ.jar
    Sep 23, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-Ny36a_afRM0HXC_oRQ1xXhWWxo81_WuORJKMCY3YKjM.jar
    Sep 23, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-GrDZodBGsKVKICopI8iZCvwgT2Cw3ej4oozntRTvNgg.jar
    Sep 23, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-pC_y-gb9AdeRP31s-sT3m1aj-RmARo3oLw0EQJIxPzU.jar
    Sep 23, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-tyYHE2pGpTfOkyNEtJm6B-_B7Dq6MyVepWRAxxJTb_U.jar
    Sep 23, 2020 6:45:46 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 194 files cached, 26 files newly uploaded in 0 seconds
    Sep 23, 2020 6:45:46 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 23, 2020 6:45:46 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 23, 2020 6:45:46 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 23, 2020 6:45:46 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 23, 2020 6:45:46 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 23, 2020 6:45:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95914 bytes, hash 5db7a36e7f24a284a98e6009e5ac0e15a63d7b46a6a9c5b68b6938f6e5a9132a> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Xbejbn8kooSpjmAJ5awOFaY9e0amqcW2i2k49uWpEyo.pb
    Sep 23, 2020 6:45:47 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 23, 2020 6:45:48 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-23_11_45_47-9177616685045180341?project=apache-beam-testing
    Sep 23, 2020 6:45:48 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-23_11_45_47-9177616685045180341
    Sep 23, 2020 6:45:48 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-23_11_45_47-9177616685045180341
    Sep 23, 2020 6:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-23T18:45:47.145Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 23, 2020 6:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T18:45:55.488Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 23, 2020 6:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T18:45:56.082Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 23, 2020 6:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T18:45:56.124Z: Expanding GroupByKey operations into optimizable parts.
    Sep 23, 2020 6:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T18:45:56.155Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 23, 2020 6:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T18:45:56.234Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 23, 2020 6:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T18:45:56.267Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 23, 2020 6:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T18:45:56.315Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 23, 2020 6:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T18:45:56.339Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 23, 2020 6:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T18:45:56.847Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 23, 2020 6:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T18:45:56.928Z: Starting 5 workers in us-central1-b...
    Sep 23, 2020 6:46:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T18:46:15.333Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 23, 2020 6:46:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T18:46:24.059Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 23, 2020 6:46:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T18:46:45.238Z: Workers have started successfully.
    Sep 23, 2020 6:46:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T18:46:45.269Z: Workers have started successfully.
    Sep 23, 2020 6:47:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T18:47:18.607Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 23, 2020 6:47:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T18:47:18.777Z: Cleaning up.
    Sep 23, 2020 6:47:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T18:47:18.859Z: Stopping worker pool...
    Sep 23, 2020 6:48:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T18:48:07.849Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 23, 2020 6:48:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T18:48:07.896Z: Worker pool stopped.
    Sep 23, 2020 6:48:18 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-23_11_45_47-9177616685045180341 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 5fb96e28-b024-43b3-9e90-d971b7463588 and timestamp: 2020-09-23T18:48:18.971000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    15.291

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 23, 2020 6:48:19 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.022 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.031 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':',5,main]) completed. Took 2 mins 46.96 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 1s
107 actionable tasks: 64 executed, 43 from cache

Publishing build scan...
https://gradle.com/s/bxmsismad36qy

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1030

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1030/display/redirect>

Changes:


------------------------------------------
[...truncated 269.61 KB...]
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_3/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)
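
The IllegalStateException above is Beam's standard complaint when a PCollection of Row elements reaches pipeline construction with neither a schema nor an explicit coder. A minimal sketch of the fix the message itself points at, using setRowSchema (or, equivalently, setCoder with a RowCoder); the schema, sample values and class name below are illustrative only and are not taken from BigQueryIOPushDownIT:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaExample {
      public static void main(String[] args) {
        // Illustrative schema only; the real HACKER_NEWS table has more fields.
        Schema schema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt32Field("score")
                .build();

        Pipeline p = Pipeline.create();

        PCollection<Row> input =
            p.apply(
                Create.of(
                        Row.withSchema(schema)
                            .addValues("alice", "story", "An example title", 3)
                            .build())
                    .withCoder(RowCoder.of(schema)));

        // A pass-through ParDo analogous to the test's ParDo(RowMonitor): its
        // output is a PCollection<Row> for which Beam cannot infer a coder.
        PCollection<Row> monitored =
            input
                .apply(
                    "RowMonitor",
                    ParDo.of(
                        new DoFn<Row, Row>() {
                          @ProcessElement
                          public void process(@Element Row row, OutputReceiver<Row> out) {
                            out.output(row);
                          }
                        }))
                // Remedy named in the error message: attach the schema so the
                // default Row coder can be used. setCoder(RowCoder.of(schema))
                // would work as well.
                .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }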

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 23, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 23, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 23, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 23, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 23, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 23, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 23, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])
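
The SQL statement and the SQLPlan/BEAMPlan above come from Beam SQL's Calcite-based planner. For reference, the same kind of query can be issued against a schema-aware PCollection with SqlTransform; the sketch below is illustrative only (the integration test drives the planner through BeamSqlRelUtils rather than SqlTransform, and the schema and sample rows are made up), with a single input referenced as PCOLLECTION:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class BeamSqlExample {
      public static void main(String[] args) {
        // Illustrative subset of the HACKER_NEWS columns used in the query.
        Schema schema =
            Schema.builder()
                .addStringField("by")
                .addStringField("type")
                .addStringField("title")
                .addInt32Field("score")
                .build();

        Pipeline p = Pipeline.create();

        PCollection<Row> hackerNews =
            p.apply(
                Create.of(
                        Row.withSchema(schema).addValues("alice", "story", "A title", 5).build(),
                        Row.withSchema(schema).addValues("bob", "comment", "ignored", 1).build())
                    .withCoder(RowCoder.of(schema)));

        // A single input PCollection is addressable as PCOLLECTION in Beam SQL.
        PCollection<Row> filtered =
            hackerNews.apply(
                SqlTransform.query(
                    "SELECT `by` AS author, type, title, score "
                        + "FROM PCOLLECTION "
                        + "WHERE (type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }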


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 23, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 23, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 23, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 23, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 23, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 23, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 23, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 23, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
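
The BEAMPlan and the "Pushing down the following filter" line show what push-down buys here: the projected fields and the filter are evaluated by the BigQuery Storage read API instead of inside the pipeline. Outside of Beam SQL, similar behaviour can be requested directly on BigQueryIO with DIRECT_READ; a rough sketch, with a placeholder table reference and project (not the test's actual configuration):

    import com.google.api.services.bigquery.model.TableRow;
    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.values.PCollection;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Projection and row restriction handed to the BigQuery Storage read
        // API, mirroring what the SQL planner pushed down above. The table
        // reference is a placeholder, not the dataset used by this test.
        PCollection<TableRow> rows =
            p.apply(
                BigQueryIO.readTableRows()
                    .from("my-project:my_dataset.HACKER_NEWS")
                    .withMethod(Method.DIRECT_READ)
                    .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                    .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }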
    Sep 23, 2020 12:45:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 23, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 23, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-7tBzDPKF4a9eLwHownqeN8GackD_nj0OXHk1vkEfoDE.jar
    Sep 23, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-O2Ork-erTFjhx9LWFEAsi9rjAc2YtM5ilCL8nTxMqmQ.jar
    Sep 23, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-jepR-uA9dSUp00X60FrVUqcwwJEQPzROlPffOPoXbvE.jar
    Sep 23, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test960500431479145301.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-uULHcKVbyivY1vNmkLRxSyu8aBnnFPYFb-JNCPhsQaQ.jar
    Sep 23, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-c6eECWDJ4387XZwlzd5HGMJsrLm2z0CJeVo2oGeFauY.jar
    Sep 23, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-ysn0EfN2O9vtncOV5IqdDub_tUOOxSOYl6XeiVu_ZY4.jar
    Sep 23, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-dEN5sSoliby8lbhgcw2Vx_JCHrLNPvr0lb48CIoD-TM.jar
    Sep 23, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-NnnyRbKA8OzXia1D5LMOeVlTLEDI-Q6ddDnfFdtfIa8.jar
    Sep 23, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-vOjPjAw_BT1AGEfvy3eMoQ736QZ1G5d2vZH5HnyRIGM.jar
    Sep 23, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-LIok_62TaLjesANjiThUGH2G1Xo3IXJy12Qtww3LVwg.jar
    Sep 23, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-QAbzCSJn23fSHGajwCPYMAP_r3fXg2HnyAXNv7gw5iY.jar
    Sep 23, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-J7vXyRLCbIohmnKKYEGIB0mSZIhFog_0Hw6QGTpzSvk.jar
    Sep 23, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-4a9FqzKvIhhpczmTXFraCCwhwHq7BJJusP6SutM9B0w.jar
    Sep 23, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-5ErGbYLaxcfkOeaMEV9Lf4z8WVY-lRNfgmfQZDKg--E.jar
    Sep 23, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-FCwu1FWM_cjRGNnMr8Ig18lgoE4p3oSzxrJCRt0Pf9g.jar
    Sep 23, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-k5mdUTmPz5L-n1CUmDU3xfGb7DFT3rciL60Lt79qVPs.jar
    Sep 23, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-7tBzDPKF4a9eLwHownqeN8GackD_nj0OXHk1vkEfoDE.jar
    Sep 23, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests--Zmm9mNHxoniy7NsQAgnKcY5Baf_llmrl4fbhRyhyio.jar
    Sep 23, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-RpygISfPbs0_nex5Ao47eHP7Rsgp5zRHCQBL5qJ4Sm0.jar
    Sep 23, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-lhltfCSi2Iqz9OjcNjsFdb8FlOr-N_LLI6rnf5_PBIw.jar
    Sep 23, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-qHswIQZnOCwQtt_jFGGxbAHa38Sza1OLoz9743idnE4.jar
    Sep 23, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-RSkMtsrUd5qeZZ7Gl_iVrYvq3wLlsBQZ9MjgTsNjAQI.jar
    Sep 23, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-fR_pfhch9hjwfqSquKaVxpuBFKV7nIFQ7RBM1OqBSUI.jar
    Sep 23, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-fnGPkVFOSm494w-RCTtP35YpZ0Ts6Wxcy24D-g4h3uc.jar
    Sep 23, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-blpzpfdshrrCRswKpcByWKgz3p7jy3O2MyZzVmb-aK0.jar
    Sep 23, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-kr2qsxcHilOqa8rFivnRVMlhNK6emeJyDEvv1aOjRqg.jar
    Sep 23, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-_kJUKEetOd_W6HZL5VL7am7AvcbNqYVac1OHLvQg7Io.jar
    Sep 23, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 194 files cached, 26 files newly uploaded in 1 seconds
    Sep 23, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 23, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 23, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 23, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 23, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 23, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95913 bytes, hash 4bd1cacc0f3416ca4120848c42736b58b91cd4c92e3d85601d648baf121de150> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-S9HKzA80FspBIISMQnNrWLkc1MkuPYVgHWSLrxId4VA.pb
    Sep 23, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 23, 2020 12:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-23_05_45_19-13033379796155648559?project=apache-beam-testing
    Sep 23, 2020 12:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-23_05_45_19-13033379796155648559
    Sep 23, 2020 12:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-23_05_45_19-13033379796155648559
    Sep 23, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-23T12:45:19.365Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 23, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T12:45:28.316Z: Worker configuration: n1-standard-1 in us-central1-f.
    Sep 23, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T12:45:29.517Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 23, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T12:45:29.591Z: Expanding GroupByKey operations into optimizable parts.
    Sep 23, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T12:45:29.609Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 23, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T12:45:29.694Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 23, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T12:45:29.724Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 23, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T12:45:29.759Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 23, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T12:45:29.788Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 23, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T12:45:30.148Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 23, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T12:45:30.226Z: Starting 5 workers in us-central1-f...
    Sep 23, 2020 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T12:45:55.717Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 23, 2020 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T12:45:56.544Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 23, 2020 12:46:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T12:46:16.754Z: Workers have started successfully.
    Sep 23, 2020 12:46:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T12:46:16.794Z: Workers have started successfully.
    Sep 23, 2020 12:46:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T12:46:52.635Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 23, 2020 12:46:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T12:46:52.771Z: Cleaning up.
    Sep 23, 2020 12:46:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T12:46:52.842Z: Stopping worker pool...
    Sep 23, 2020 12:47:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T12:47:39.545Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 23, 2020 12:47:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T12:47:39.586Z: Worker pool stopped.
    Sep 23, 2020 12:47:50 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-23_05_45_19-13033379796155648559 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 231bfa9e-2f18-4aaf-bdfc-54b84d728c1f and timestamp: 2020-09-23T12:47:50.077000000Z:
                     Metric:                    Value:
                   read_time                    15.226
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 23, 2020 12:47:50 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker,5,main]) completed. Took 2 mins 44.415 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 33s
107 actionable tasks: 62 executed, 45 from cache

Publishing build scan...
https://gradle.com/s/prjgedem44u6m

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1029

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1029/display/redirect?page=changes>

Changes:

[noreply] Revert "[BEAM-9547] Raise NotImplementedError and WontImplementError

[noreply] [BEAM-7372] Drop Python 2 shims and update docstring in

[noreply] [BEAM-10769] Clarify Avro IO documentation. (#12638)

[noreply] [BEAM-10814][BEAM-10570] DataframeTransform outputs elements (#12882)

[noreply] [BEAM-10871] Fix FhirLROIT tests (again) (#12908)


------------------------------------------
[...truncated 269.22 KB...]
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_3/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 23, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 23, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 23, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 23, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 23, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 23, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 23, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 23, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 23, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 23, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 23, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 23, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 23, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 23, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 23, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 23, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 23, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 23, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-UidrXTNXUjqQQ9bXw7RnutXVdh3GVU0-OAbtkTE_DRA.jar
    Sep 23, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-6v_yX2nKxuhe-titZqRzS4CZPFbEi1IWgt2y9JMenqg.jar
    Sep 23, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests--JaEGVLQs8M9aQiRTfz06P6ERlih-QsY1NlrjB2aqh0.jar
    Sep 23, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-M1TW7i9HDSS1wK5WPi50PuqrxyIcYJD44TGqmnmNU0w.jar
    Sep 23, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-uQxLDq5lYydM_vpfD1bQGoP6Ahim_JnuEoCfBKnIInk.jar
    Sep 23, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-a5-YatShy154gE7vk2Qc3tYYGS_3x8kYDSOjBivEP7U.jar
    Sep 23, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-eujvq5wl82jWbUbgLp_JekIo7r9-EtU78L7yNczqj80.jar
    Sep 23, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-8nVfmJVnEHUJ8SodFYD0fZ_GXgbxaz0GyVe6xyqRKgo.jar
    Sep 23, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-Nsu-nJ6pCUVT3aRbxfuKCYgh2nKqIugygrZRcprHISM.jar
    Sep 23, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-1SppRaa4G_OxFErdHYleB2p4oHbkr-MpuBnjQeHS2s0.jar
    Sep 23, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-mxzHv9r8ptWsxVJ_rVfGLV7-0qcdgz7c2cn8TOWKw4w.jar
    Sep 23, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-HCn6yKHHL_DY0vShljjnONefDJSZ5yiWC2kW-3J3roQ.jar
    Sep 23, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-56P9dgbR3AHiqGSmmQZDTFrPazViEIt6Ydxv9yZAn_4.jar
    Sep 23, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-cLS8FC_qEoZ_Ozye-oynsstTax4N2XkOso2iYwRHdTI.jar
    Sep 23, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-Qt0omtQacMfLiXev3zw5JExVUxt0Wp_7qzhfZMYYgew.jar
    Sep 23, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-qVPhCGPvYJGIbrwakoaGMJYG8ZIg4URgD6iklathKm8.jar
    Sep 23, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-fT_LipMc4Dc5xqdCxkhJq0_bPqJy2Ev2EkLxmW-ZKP8.jar
    Sep 23, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-qpD9DA1iHA7dfdaOwYRVAEmrVYnohYE2Pk6T_4SFK6U.jar
    Sep 23, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5085017616897959886.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-nYhBzK8chwQDhgNpj4c-b-Zs4u0Tr9B46BS56skNzoU.jar
    Sep 23, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-i1hfUZoB-wbz6sqMb_JOqZqGxHg1vP2ff_VZ7TRzKmU.jar
    Sep 23, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-9M7CrhpeIQyHDJIJWkC8uH_uONbo-ShJKZErJIiGbKk.jar
    Sep 23, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-WZrlCxZYbytWMxqzcwWF0b20GxiP1MrrcRSDwhPU4xY.jar
    Sep 23, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-nmm9Dz-29pYWXTP0wWr7VaFbBH1Yg6U2xuiSycDKK1I.jar
    Sep 23, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-UidrXTNXUjqQQ9bXw7RnutXVdh3GVU0-OAbtkTE_DRA.jar
    Sep 23, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-8nT3fw4t6NA7uQqdeRTeQ4eNV_AOfoownwq4qUZHjFk.jar
    Sep 23, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-G9cgmOfroZ91vdSiQsf6N2h2lNoFaGyt64I4OYMwa-k.jar
    Sep 23, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-PYGa99XUl0a3PejVDJJKzZWRbYmqJWpVBfpPNdcX7zc.jar
    Sep 23, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 194 files cached, 26 files newly uploaded in 0 seconds
    Sep 23, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 23, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 23, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 23, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 23, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 23, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95914 bytes, hash 12704a5ad4de22d65d9ebd49eed1808724784cf3200f6e27aaaa3b73e30a37a6> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-EnBKWtTeItZdnr1J7tGAhyR4TPMgD24nqqo7c-MKN6Y.pb
    Sep 23, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 23, 2020 6:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-22_23_45_20-2325826040322952474?project=apache-beam-testing
    Sep 23, 2020 6:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-22_23_45_20-2325826040322952474
    Sep 23, 2020 6:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-22_23_45_20-2325826040322952474
    Sep 23, 2020 6:45:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-23T06:45:20.621Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
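
The warning above comes from combining a fixed worker count with a max-workers setting while autoscaling is disabled. A minimal sketch of the relevant Dataflow options, assuming the test pins 5 workers as the log suggests:

    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions.AutoscalingAlgorithmType;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class WorkerOptionsSketch {
      public static void main(String[] args) {
        DataflowPipelineOptions options =
            PipelineOptionsFactory.fromArgs(args).as(DataflowPipelineOptions.class);
        options.setAutoscalingAlgorithm(AutoscalingAlgorithmType.NONE); // autoscaling disabled
        options.setNumWorkers(5);    // the 5 workers that are actually started
        options.setMaxNumWorkers(5); // ignored when the algorithm is NONE, per the warning
      }
    }
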
    Sep 23, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T06:45:29.264Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 23, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T06:45:30.023Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 23, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T06:45:30.089Z: Expanding GroupByKey operations into optimizable parts.
    Sep 23, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T06:45:30.134Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 23, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T06:45:30.220Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 23, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T06:45:30.259Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 23, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T06:45:30.294Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 23, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T06:45:30.324Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 23, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T06:45:30.878Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 23, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T06:45:30.960Z: Starting 5 workers in us-central1-b...
    Sep 23, 2020 6:46:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T06:45:58.976Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 23, 2020 6:46:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T06:45:59.511Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 23, 2020 6:46:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T06:46:17.413Z: Workers have started successfully.
    Sep 23, 2020 6:46:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T06:46:17.466Z: Workers have started successfully.
    Sep 23, 2020 6:46:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T06:46:49.481Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 23, 2020 6:46:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T06:46:49.876Z: Cleaning up.
    Sep 23, 2020 6:46:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T06:46:50.079Z: Stopping worker pool...
    Sep 23, 2020 6:47:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T06:47:41.525Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 23, 2020 6:47:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T06:47:41.579Z: Worker pool stopped.
    Sep 23, 2020 6:47:48 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-22_23_45_20-2325826040322952474 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 83e6287f-7041-48b4-b21b-85a85a10074c and timestamp: 2020-09-23T06:47:48.767000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    14.608

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 23, 2020 6:47:49 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 2 mins 43.238 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 32s
107 actionable tasks: 62 executed, 45 from cache

Publishing build scan...
https://gradle.com/s/p2y5eyxinukgy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1028

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1028/display/redirect?page=changes>

Changes:

[MATTHEW.Ouyang] [BEAM-10532] convert NUMERIC field in TableSchema

[Kyle Weaver] [BEAM-10768] Sleep in time-based flush test to ensure correct ordering.

[Kyle Weaver] Move ZetaSQL UDF tests into separate class.

[chamikaramj] Sets sdk_harness_container_images property for all UW jobs

[Kyle Weaver] Move testCreateFunctionNoSelectThrowsException into ZetaSqlUdfTest.

[Kyle Weaver] Add comment explaining flush callbacks.

[noreply] [BEAM-10894] Support for more pandas formats. (#12844)

[noreply] [BEAM-9547] Raise NotImplementedError and WontImplementError throughout

[noreply] [BEAM-4091] Pass type hints in ptransform_fn (#9907)

[noreply] [BEAM-10716] TestPubsub/TestPubsubSignal clean up subscriptions (#12830)


------------------------------------------
[...truncated 271.68 KB...]
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 23, 2020 12:45:31 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 23, 2020 12:45:31 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 23, 2020 12:45:31 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 23, 2020 12:45:31 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 23, 2020 12:45:31 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 23, 2020 12:45:31 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 23, 2020 12:45:31 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
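
The failure above is the generic missing-coder check, and the error message itself names the remedy: give the Row-typed output an explicit schema. A minimal sketch of that remedy follows; the schema is an assumption derived from the projected columns (author, type, title, score), and RowMonitorFn is only a stand-in for the IT's RowMonitor transform, not its actual code.

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.schemas.Schema.FieldType;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      // Stand-in for the IT's RowMonitor; simply forwards each Row.
      static class RowMonitorFn extends DoFn<Row, Row> {
        @ProcessElement
        public void processElement(@Element Row row, OutputReceiver<Row> out) {
          out.output(row);
        }
      }

      static PCollection<Row> monitor(PCollection<Row> rows) {
        // Assumed schema matching the columns projected by the logged query.
        Schema schema =
            Schema.builder()
                .addNullableField("author", FieldType.STRING)
                .addNullableField("type", FieldType.STRING)
                .addNullableField("title", FieldType.STRING)
                .addNullableField("score", FieldType.INT64)
                .build();
        return rows
            .apply("RowMonitor", ParDo.of(new RowMonitorFn()))
            // Without an explicit schema (or .setCoder()), PCollection.getCoder()
            // fails exactly as in the stack trace above.
            .setRowSchema(schema);
      }
    }
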

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 23, 2020 12:45:31 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 23, 2020 12:45:31 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 23, 2020 12:45:31 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 23, 2020 12:45:31 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 23, 2020 12:45:31 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 23, 2020 12:45:31 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 23, 2020 12:45:31 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 23, 2020 12:45:31 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 23, 2020 12:45:32 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 23, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 23, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-HuInn11HjBtS5NTYk9PbUW2qd0kTIdvEoNRbU4gx1Tk.jar
    Sep 23, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-2ZyeHhzUxWjP8BZgZkhrr4Km4QYr_NMBZYnxqtTiJCA.jar
    Sep 23, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-HmPZNyCTvXyg5jeneJO_Zxd2M5yf5LcNb5SCr6aOPbw.jar
    Sep 23, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-mSpXSA353AwY4brJB67llOxJULvvCamft7-Sb6NF5uc.jar
    Sep 23, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-J-CFivQ93YVOkA3JbyLrBTqxVxKHAfHz85WNRz5o9Hs.jar
    Sep 23, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-SaoHghgpNWLNHg_0HUuKhKGjoctBi3kHMYOgKz6bC4g.jar
    Sep 23, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-vsvkWVhzUNjEnnXbaWDn6DkmU24XjrkiCeYEYMAKlcY.jar
    Sep 23, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-fLKFyHlyGeBQbPIWY2eOWoERuLU-F0KpThOjq0hlQu8.jar
    Sep 23, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-anRG4bNAlZJ9LO1pR4R-oFKsXsde-JiRAnAOgZlWl1E.jar
    Sep 23, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4207008138718935195.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-EjyfTIpojWnXkxru_yJ81mNts1Pm-95QZKL0hsqF88I.jar
    Sep 23, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-02hOU4dDfiEMlsGsx12KTlzhFMXjrNY-OYOwkjkOUQ4.jar
    Sep 23, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-EqH0PcN9405t3iSjkTSQxuNiTJIRGA_DSylEHnyflyo.jar
    Sep 23, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-baSQ1Y4hIxLzIzHwmUo7PdusymKJn0H3YoK_BayiLw0.jar
    Sep 23, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-88q6hUw_ffSAaL72UB6b-oXO7OZoNHJ8roeB2s-6Gik.jar
    Sep 23, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-GPIKdYwqAiPqtz11AC3sVYVETa6VrpJSsR6t5eptgjg.jar
    Sep 23, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-jyi4-vGr-O-xfC5Aj4d8JyC3oTDRM9cTAIAz2w_lk1M.jar
    Sep 23, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-5nnGK7HhaIEsq9VyPBea8GOz2y9l3zDc1kc5Tl0fi38.jar
    Sep 23, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-IwG7yFyDYuo43wJM2edbp_fqOIB0d6HY69EpHAp6BWo.jar
    Sep 23, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-vILBY0IOSHXwq09IYdPGpvgeZhnJgxRGY3HtHwFo978.jar
    Sep 23, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-ySgO2mqHMivAwsb3pIF72xA00_Lv3R9H_dTbVnDKtQw.jar
    Sep 23, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-HuInn11HjBtS5NTYk9PbUW2qd0kTIdvEoNRbU4gx1Tk.jar
    Sep 23, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-_e1L1CcAiQmYBCtoRZZQ2lxpH-77ceLARuIu5g3_yQE.jar
    Sep 23, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-ysL5voD6gMwgOpFU1P7JJ0-kgstbErMYpGWmn0XSv2c.jar
    Sep 23, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-Y_z9YBs_JUNc6FwpJP8KzYhhzB87TAVPGA1-SyKsK4c.jar
    Sep 23, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-WDfZ9PHTvV-oEgMuVblJefo9npyZtDQ9e7SKEF3mmeA.jar
    Sep 23, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-cV7lusKnm0MUIyjJUE6gk4lhjUTDl3htm-hphYv1ayc.jar
    Sep 23, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-jRO7MHXAZLP96sVJ35veIFR2AGKWeyfEtCVophDOeCM.jar
    Sep 23, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 194 files cached, 26 files newly uploaded in 1 seconds
    Sep 23, 2020 12:45:37 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 23, 2020 12:45:37 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 23, 2020 12:45:37 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 23, 2020 12:45:37 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 23, 2020 12:45:37 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 23, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95916 bytes, hash 5b7378cf1cb8d4d28812b549b6093145d2db5e1f8741fdb38abf3e4b7ef7c9df> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-W3N4zxy41NKIErVJtgkxRdLbXh-HQf2zir8-S373yd8.pb
    Sep 23, 2020 12:45:37 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 23, 2020 12:45:38 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-22_17_45_37-12309088467168574468?project=apache-beam-testing
    Sep 23, 2020 12:45:38 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-22_17_45_37-12309088467168574468
    Sep 23, 2020 12:45:38 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-22_17_45_37-12309088467168574468
    Sep 23, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-23T00:45:37.624Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 23, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T00:45:45.686Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 23, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T00:45:46.220Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 23, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T00:45:46.329Z: Expanding GroupByKey operations into optimizable parts.
    Sep 23, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T00:45:46.405Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 23, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T00:45:46.484Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 23, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T00:45:46.507Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 23, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T00:45:46.561Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 23, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T00:45:46.581Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 23, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T00:45:47.321Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 23, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T00:45:47.430Z: Starting 5 workers in us-central1-b...
    Sep 23, 2020 12:46:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T00:46:05.005Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 23, 2020 12:46:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T00:46:18.165Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 23, 2020 12:46:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T00:46:37.236Z: Workers have started successfully.
    Sep 23, 2020 12:46:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T00:46:37.267Z: Workers have started successfully.
    Sep 23, 2020 12:47:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T00:47:10.258Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 23, 2020 12:47:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T00:47:10.516Z: Cleaning up.
    Sep 23, 2020 12:47:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T00:47:10.680Z: Stopping worker pool...
    Sep 23, 2020 12:48:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T00:48:08.883Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 23, 2020 12:48:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T00:48:08.936Z: Worker pool stopped.
    Sep 23, 2020 12:48:16 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-22_17_45_37-12309088467168574468 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 97c4ae8a-b803-4cfa-b564-096dd07ad0b9 and timestamp: 2020-09-23T00:48:16.875000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    13.221

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 23, 2020 12:48:17 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.019 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 8,5,main]) completed. Took 2 mins 54.413 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 59s
107 actionable tasks: 63 executed, 44 from cache

Publishing build scan...
https://gradle.com/s/lscmfg7nvyqhk

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1027

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1027/display/redirect?page=changes>

Changes:

[srohde] Add record_pipeline and clear to RM and fix duration limiter

[srohde] Add comments to RecordingManager

[Kenneth Knowles] Upgrade GCS IO to 2.1.5 and Google OAuth to 1.31.0

[noreply] [BEAM-10871] Fix FhirLROIT tests (#12902)

[noreply] [BEAM-2546] Add InfluxDbIO (#11459)

[noreply] [BEAM-9680] Add Aggregation Min and Max lessons to Go SDK katas (#12861)

[noreply] Merge pull request #12762 from [BEAM-10948] Ensuring that BigQuery jobs

[noreply] Merge pull request #12900 from [BEAM-10941] Use standard sharding


------------------------------------------
[...truncated 284.17 KB...]
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 22, 2020 6:49:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 22, 2020 6:49:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 22, 2020 6:49:04 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 22, 2020 6:49:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 22, 2020 6:49:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 22, 2020 6:49:04 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 22, 2020 6:49:05 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 22, 2020 6:49:05 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 22, 2020 6:49:05 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 22, 2020 6:49:05 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 22, 2020 6:49:05 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 22, 2020 6:49:05 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 22, 2020 6:49:05 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 22, 2020 6:49:05 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 22, 2020 6:49:05 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 22, 2020 6:49:06 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 22, 2020 6:49:10 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 22, 2020 6:49:10 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-ooh0YdkglWSusU5ON1p6tHH0uE86JvAJN89Flh5sVNQ.jar
    Sep 22, 2020 6:49:10 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-bHC3ZNwhRDrw3rd7CBnWhkf6_gO1Jrw66l8Q9UNYEqs.jar
    Sep 22, 2020 6:49:10 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-311YmK3zYKNAbLS4VcZlunRJsNQ34ck8NBCYe8iM1_E.jar
    Sep 22, 2020 6:49:10 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-l7XM8XcPwWCpV-tJNNQSAdfJpvDw--vYfRbJYRz4hoM.jar
    Sep 22, 2020 6:49:10 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-zyysBM21P72j2XPPYuRh_j2wcGdKJXSwrtmDQe8GI94.jar
    Sep 22, 2020 6:49:10 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-ooh0YdkglWSusU5ON1p6tHH0uE86JvAJN89Flh5sVNQ.jar
    Sep 22, 2020 6:49:10 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-mMiWP_Maoe19IoDGXyhe0Epx-qUz43TF_xiREIzsEOc.jar
    Sep 22, 2020 6:49:10 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-5OBRzo2CFsOPEhIlPru0RMMPIfIUdE5bc1Gi4wBozd8.jar
    Sep 22, 2020 6:49:10 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-enxgzLhRBtHLFoljY5xheKNShA3wbt6LxpoM_OTzY9s.jar
    Sep 22, 2020 6:49:10 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-S1LbWrZZZMoGPyburKne6WOTRgB-MWztJytjjPQgMl8.jar
    Sep 22, 2020 6:49:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-8n5XLDYLDgB3b7SCrHsGKshwC2Xo7Zt9Y61jMo3EWqM.jar
    Sep 22, 2020 6:49:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test9191983806847350017.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-S8b5Sn4qljGfxh9r--lL2M56YsVvaF75nd51luYn1sE.jar
    Sep 22, 2020 6:49:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-PITwZ8JOx-3N5yf5knsj-G8pnackcDao6cqe-2-GyLY.jar
    Sep 22, 2020 6:49:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-wMsh2US9FCGgK56oeei2Ysaqdeqbe0QFWweUr4kzpok.jar
    Sep 22, 2020 6:49:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-LYR6dmf5FDNYmfR2-dr7xT3_-OYVVvTqnBRjiGBYfpM.jar
    Sep 22, 2020 6:49:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-SvJAYRVVT-5THd-G1rc0vLp2vEVnxy7DV2IXXjdBY3Q.jar
    Sep 22, 2020 6:49:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-D5T1WAjhNbtHdAMpIK34dWwKOUwXCwKTosMXHEbsiJU.jar
    Sep 22, 2020 6:49:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-zFB36ixE1pjnoQDIMj9KViiDqanb0FKrt1p5OqPkHZU.jar
    Sep 22, 2020 6:49:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-oOsDvTbcSIbPh8Jpr0Hdlo5_cnZ6B8SPcz4n2HX6Jy4.jar
    Sep 22, 2020 6:49:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-if7gjSxCvW_C-piDJQQr1YF917dkIoDBRaC0dh54WjM.jar
    Sep 22, 2020 6:49:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-itcgPEAZz4sRRHfqAZBMep5VtuVIaE8eRkufByQtax8.jar
    Sep 22, 2020 6:49:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-4_dDNw27TkSF1miuYY9YLoeGdjhLa8SBz9T9Kw0ru24.jar
    Sep 22, 2020 6:49:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-_mCVuKrhLMObVpTLYJ8irJpatHR2lurbsiT-MT8myFQ.jar
    Sep 22, 2020 6:49:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-NH7-bHqJOiqZFqsfizj7V2WqiebmtmIywf1CWxdzCcs.jar
    Sep 22, 2020 6:49:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-dgZ5DWrGj_dpCl4dmbUo9x8RvuFTDoHiul_QZ6yEnw4.jar
    Sep 22, 2020 6:49:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-eqVScTn974sbpdrNxntQdGJO2yrA6SHRTnrr7hV3h9s.jar
    Sep 22, 2020 6:49:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-WiXLrFdpvUHINCB1DFGzVMvFNQ0ZhDRfNhBgIptoHoY.jar
    Sep 22, 2020 6:49:13 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 194 files cached, 26 files newly uploaded in 2 seconds
    Sep 22, 2020 6:49:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 22, 2020 6:49:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 22, 2020 6:49:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 22, 2020 6:49:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 22, 2020 6:49:13 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 22, 2020 6:49:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95917 bytes, hash 4ab6dd47606d7f14c4e61ce938b059086ef4edccf2e46867435eba9e168bc4da> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-SrbdR2BtfxTE5hzpOLBZCG707czy5GhnQ166nhaLxNo.pb
    Sep 22, 2020 6:49:13 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 22, 2020 6:49:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-22_11_49_14-18164303575540618902?project=apache-beam-testing
    Sep 22, 2020 6:49:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-22_11_49_14-18164303575540618902
    Sep 22, 2020 6:49:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-22_11_49_14-18164303575540618902
    Sep 22, 2020 6:49:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-22T18:49:14.030Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 22, 2020 6:49:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T18:49:22.014Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 22, 2020 6:49:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T18:49:22.804Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 22, 2020 6:49:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T18:49:22.856Z: Expanding GroupByKey operations into optimizable parts.
    Sep 22, 2020 6:49:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T18:49:22.885Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 22, 2020 6:49:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T18:49:23.022Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 22, 2020 6:49:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T18:49:23.051Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 22, 2020 6:49:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T18:49:23.077Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 22, 2020 6:49:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T18:49:23.113Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 22, 2020 6:49:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T18:49:23.600Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 22, 2020 6:49:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T18:49:23.679Z: Starting 5 workers in us-central1-b...
    Sep 22, 2020 6:49:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T18:49:31.134Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 22, 2020 6:49:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T18:49:53.267Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 22, 2020 6:50:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T18:50:11.270Z: Workers have started successfully.
    Sep 22, 2020 6:50:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T18:50:11.300Z: Workers have started successfully.
    Sep 22, 2020 6:50:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T18:50:46.061Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 22, 2020 6:50:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T18:50:46.266Z: Cleaning up.
    Sep 22, 2020 6:50:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T18:50:46.359Z: Stopping worker pool...
    Sep 22, 2020 6:51:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T18:51:38.087Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 22, 2020 6:51:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T18:51:38.131Z: Worker pool stopped.
    Sep 22, 2020 6:51:46 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-22_11_49_14-18164303575540618902 finished with status DONE.

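As context for the plan printed earlier in this test (BeamPushDownIOSourceRel with usedFields=[by, type, title, score] and the condition logged as "Pushing down the following filter"), the push-down behaves roughly like a hand-written BigQueryIO read that uses the Storage API with field selection and a row restriction. A sketch under that assumption, not the code Beam SQL actually generates (the table spec below is a placeholder):

    import java.util.Arrays;
    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
    import org.apache.beam.sdk.values.PCollection;

    PCollection<TableRow> rows = pipeline.apply(
        "Read Input BQ Rows with push-down",
        BigQueryIO.readTableRows()
            .from("some-project:some_dataset.HACKER_NEWS")   // placeholder table spec
            .withMethod(Method.DIRECT_READ)
            // Request only the columns the query uses ...
            .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
            // ... and let the Storage API evaluate the WHERE clause server-side.
            .withRowRestriction("(`type` = 'story' OR `type` = 'job') AND `score` > 2"));
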
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 45fb1d89-ebcc-40c3-b9a6-7e1820ee8414 and timestamp: 2020-09-22T18:51:46.978000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    15.149

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 22, 2020 6:51:47 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 3 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.031 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 2 mins 54.272 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 10s
107 actionable tasks: 69 executed, 38 from cache

Publishing build scan...
https://gradle.com/s/xno52dh363boi

Stopped 2 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1026

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1026/display/redirect?page=changes>

Changes:

[piotr.szuberski] [BEAM-9898] Add stub with imports to apache_beam.io.snowflake

[noreply] [BEAM-10916] Remove experimental annotations for BQ storage API source


------------------------------------------
[...truncated 273.40 KB...]
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 22, 2020 12:45:46 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 22, 2020 12:45:46 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 22, 2020 12:45:46 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 22, 2020 12:45:46 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 22, 2020 12:45:46 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 22, 2020 12:45:46 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 22, 2020 12:45:46 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 22, 2020 12:45:47 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 22, 2020 12:45:47 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 22, 2020 12:45:47 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 22, 2020 12:45:47 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 22, 2020 12:45:47 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 22, 2020 12:45:47 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 22, 2020 12:45:47 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 22, 2020 12:45:47 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 22, 2020 12:45:48 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 22, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 22, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-MFuspS7gr09U1LeNzenqwyFfVpIT9ijJzKKjlmPIHEQ.jar
    Sep 22, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-3xdgD14D8e1bmBzwvpQ62D_rjQ-ClI2M_1EJ0B4xn0M.jar
    Sep 22, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-vMdvWMJcBnZ8twyOYYSyGq18RUoiO4AEDQURJQVZqNM.jar
    Sep 22, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-Ommkk_FcQd_3ljeZ1SR6rAvwr21NPfyD8l9zj8hyp-o.jar
    Sep 22, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-YAjGgpX-Y24Oudevwv8bFUv209wYa9zWN2P9ifu24E8.jar
    Sep 22, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-EYdnXbIaSRLxy6gSDEeBm9Rdr97CfUbLDIezZW3W_Q0.jar
    Sep 22, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-k5zmcUpkC1Da7Dl4aixZBnJ3SOwZA229OL6cNVUJk8g.jar
    Sep 22, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-yastOkDpOm53T-uXHSI4TDjTUZM_MUTeFINCHM2cIVc.jar
    Sep 22, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-Ommkk_FcQd_3ljeZ1SR6rAvwr21NPfyD8l9zj8hyp-o.jar
    Sep 22, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-T-rZbsnGpTb6sY_VYyYOnTDBEbI2zyNcvnPx2mx4-dc.jar
    Sep 22, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-c4F_Ith1wgb8iz9fT1pqJ1uppQM-YA89otjaYhrKBUM.jar
    Sep 22, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-c9TOwdFfFiRyaONv2gaqbsWv2gKPRuomnAsW8wBHCG4.jar
    Sep 22, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-NaE1iWtK7wyF_9MtogBJaDP2cDmvkN7-ELy2GBIdaLQ.jar
    Sep 22, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-W4GJOYbVo5RIadIeKLz8RDHWebVBHROTFtSSndMbRv8.jar
    Sep 22, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-4Ubi5yD6uNGCmuzLTm3cohwacTbojNVMq4g9RkLna0E.jar
    Sep 22, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-jOQJiwe9g5Gbuzoo0MBL3jZZ-N2YA8HmVwnieu4c01E.jar
    Sep 22, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-19Cuy7OCRoYLj07BXu5x93rkH1gMK-981A0xNvJ58Y4.jar
    Sep 22, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-UEqx7pKcucL7Oirph1hbmx-Ew-z0o5fvW-HOEUP_3BQ.jar
    Sep 22, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-KlEFZL0iZo1va_eF5y7GyFPlBYsUdmUv3NftqONTRCk.jar
    Sep 22, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-dgKlF_mt2iEaOJ689S7Dr3SKQI3WXKbVPx4AqY7pvY0.jar
    Sep 22, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-ShhRTHfSV1_HrL_BBquXA0-JWnzmibbvbaCzhPTq9yQ.jar
    Sep 22, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7448176562116324052.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-L32Gy0Pw3X5TNFcf3nFpHrlQhk4Zamr5i3CJCtUse4g.jar
    Sep 22, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-22h8rTF7OJeQYKtYKZdLYeUqULQnaclb1DvZAFh6FDU.jar
    Sep 22, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-ZY7cnM_NmutL3eUq4xV4vypEwTc3WnXarZJz_AbxKsA.jar
    Sep 22, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-I23XE8_OCq6dBuqGvJNxHPraEHSTvV_hgfYaqHbsZMc.jar
    Sep 22, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-B7M88wo2ZSgKD3QdonkXnm2Q_LfG-aJ6XO2SH-X7fXk.jar
    Sep 22, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-cr5bW0LGM8ArnW6dMwco0U5TAu8f1K_cUJgPQ6yD3hA.jar
    Sep 22, 2020 12:45:52 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 193 files cached, 26 files newly uploaded in 1 seconds
    Sep 22, 2020 12:45:52 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 22, 2020 12:45:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 22, 2020 12:45:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 22, 2020 12:45:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 22, 2020 12:45:53 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 22, 2020 12:45:53 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95445 bytes, hash 542c406eabd851b8eff11e77c1319d5ac5d14180a16e2433920b80e94d9fa75d> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-VCxAbqvYUbjv8R53wTGdWsXRQYChbiQzkguA6U2fp10.pb
    Sep 22, 2020 12:45:53 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 22, 2020 12:45:54 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-22_05_45_53-5973426972305216413?project=apache-beam-testing
    Sep 22, 2020 12:45:54 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-22_05_45_53-5973426972305216413
    Sep 22, 2020 12:45:54 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-22_05_45_53-5973426972305216413
    Sep 22, 2020 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-22T12:45:53.456Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 22, 2020 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T12:46:02.026Z: Worker configuration: n1-standard-1 in us-central1-f.
    Sep 22, 2020 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T12:46:02.759Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 22, 2020 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T12:46:02.807Z: Expanding GroupByKey operations into optimizable parts.
    Sep 22, 2020 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T12:46:02.842Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 22, 2020 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T12:46:02.920Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 22, 2020 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T12:46:02.968Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 22, 2020 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T12:46:03.002Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 22, 2020 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T12:46:03.049Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 22, 2020 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T12:46:03.529Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 22, 2020 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T12:46:03.616Z: Starting 5 workers in us-central1-f...
    Sep 22, 2020 12:46:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T12:46:31.728Z: Autoscaling: Raised the number of workers to 3 based on the rate of progress in the currently running stage(s).
    Sep 22, 2020 12:46:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T12:46:31.767Z: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
    Sep 22, 2020 12:46:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T12:46:35.280Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 22, 2020 12:46:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T12:46:37.109Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 22, 2020 12:46:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T12:46:59.338Z: Workers have started successfully.
    Sep 22, 2020 12:46:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T12:46:59.379Z: Workers have started successfully.
    Sep 22, 2020 12:47:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T12:47:29.242Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 22, 2020 12:47:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T12:47:29.380Z: Cleaning up.
    Sep 22, 2020 12:47:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T12:47:29.484Z: Stopping worker pool...
    Sep 22, 2020 12:48:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T12:48:27.380Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 22, 2020 12:48:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T12:48:27.424Z: Worker pool stopped.
    Sep 22, 2020 12:48:36 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-22_05_45_53-5973426972305216413 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): e1358db6-e85f-4814-9c54-a014675fc354 and timestamp: 2020-09-22T12:48:36.115000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     11.89

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 22, 2020 12:48:36 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.154 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 2 mins 58.468 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 19s
107 actionable tasks: 65 executed, 42 from cache

Publishing build scan...
https://gradle.com/s/7seyb5vpgcaie

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1025

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1025/display/redirect?page=changes>

Changes:

[Chad Dombrova] [BEAM-7746] Add type checking to runners.pipeline_context

[chamikaramj] Performs Dataflow specific pipeline updates before creating the pipeine

[noreply] [BEAM-9372][BEAM-9980] Makes the Python version in Flink VR suite

[noreply] [BEAM-7372][BEAM-9980] Cleans up Flink precommit VR suite definition and


------------------------------------------
[...truncated 273.48 KB...]
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 22, 2020 6:45:40 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 22, 2020 6:45:40 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 22, 2020 6:45:40 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 22, 2020 6:45:40 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 22, 2020 6:45:40 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 22, 2020 6:45:40 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 22, 2020 6:45:40 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
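
    [Editor's note, not part of the Jenkins output] The IllegalStateException above names the usual remedies: set a coder explicitly or attach a schema with PCollection.setRowSchema. Below is a minimal, self-contained sketch of that pattern for a PCollection<Row>. It is illustrative only and assumes a hypothetical RowSchemaSketch class, a made-up author/type/title/score schema mirroring the query's projection, and a pass-through DoFn; it is not the actual fix applied to BigQueryIOPushDownIT.

    // Illustrative sketch: attach a schema (or explicit coder) so Row coder inference succeeds.
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        // Hypothetical schema mirroring the projected columns of the query in the log.
        Schema schema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();

        PCollection<Row> rows =
            p.apply(
                "CreateRows",
                Create.of(Row.withSchema(schema).addValues("alice", "story", "Hello", 42L).build())
                    .withCoder(RowCoder.of(schema)));

        rows.apply(
                "PassThrough",
                ParDo.of(
                    new DoFn<Row, Row>() {
                      @ProcessElement
                      public void processElement(@Element Row row, OutputReceiver<Row> out) {
                        out.output(row);
                      }
                    }))
            // A ParDo producing Row cannot infer its output coder; attaching the schema here
            // avoids the IllegalStateException seen in the failing tests.
            .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }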

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 22, 2020 6:45:40 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 22, 2020 6:45:40 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 22, 2020 6:45:41 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 22, 2020 6:45:41 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 22, 2020 6:45:41 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 22, 2020 6:45:41 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 22, 2020 6:45:41 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 22, 2020 6:45:41 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 22, 2020 6:45:42 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 22, 2020 6:45:44 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 22, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-B0FcwKXYt5gXMBG8PMdwhG3dJZsol0tShHgxhhwPpVQ.jar
    Sep 22, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-cCXB9vb1LUB2kEGz5QTT9cd75wGWA9phiDoiVEvaPhY.jar
    Sep 22, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-2yDd7vOAwrPECFworp8CyxudCpvFUTy_piAYZVnc6d8.jar
    Sep 22, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-8aGNbFSR676Ub-rOUCiJLRVAYVu6V0OeGSi1VFU_gS4.jar
    Sep 22, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-ISxaJTsKYP9C97tw7v05QFrjEtHVYOzsJbV2a3tOSQE.jar
    Sep 22, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-B0FcwKXYt5gXMBG8PMdwhG3dJZsol0tShHgxhhwPpVQ.jar
    Sep 22, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-GQaZNdb6rjVsX5l2xRKsDReidLvSW4uM8vVjc5hP1pc.jar
    Sep 22, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-pCPqoe1M8n7rpebjKqq0-UjBBWiDpjcnr36yoDkho9Q.jar
    Sep 22, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-kaj_m0ERIktVmsAT9cuewCtsJbY79lcUejEmqEBIGjw.jar
    Sep 22, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-UWc04qPcgiOjzvC0GrUeO27YMhvjueVwpcF4A975m3o.jar
    Sep 22, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-7YtuyWLAV6KlBbgnAt00EZwC3dH-kNvd-KuvTvXat7U.jar
    Sep 22, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-AMyYYK4FDILaU5scmZNs557gKjlFw_-WMZShDpT8Rvs.jar
    Sep 22, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5367325929738414304.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-rm8ZwpC1cOk4tNHh9BzkiUrk8uJLsYj-Nw0fWjvti2Y.jar
    Sep 22, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-r_-gfDObr-rjXy4i6ahfMk3zInyc_41UjMG-OqPQSIg.jar
    Sep 22, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-SzNQ5BvuVVBcxRkKly4pGcwWO7VFvvEyjOpR4KWneJM.jar
    Sep 22, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-o1ExzQhqftE527ZDzHNHckNHvjz-BsiFTAT9gWMMcVE.jar
    Sep 22, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-jrzp_8NMwRE064wr2kTbCuP7C69Vy8Ibz1ZUhBYdSeA.jar
    Sep 22, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-P_mTew-DCz8BCbaqDkB1kLpPkODRdL5rA9zq-iISxBE.jar
    Sep 22, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-bc4nMcUn8HejhSQVIhvkn1TmGYXZ1ykfs6xEmhKItJA.jar
    Sep 22, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-1lUgX9VWok6xkiBwBLH8cbRMEATi-7D8nF3G_BAX3UU.jar
    Sep 22, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-awQzwBIGdskFT0vPOIhBsktzGKZzb4xIvDpFlz0ORBw.jar
    Sep 22, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-FZLjeW23SJ3Qv0WCq116pnT2JbFoFm2AvKoG9woi9Cw.jar
    Sep 22, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-_AVTMh70TfsrS0Bz2rSyPCaXMNu4COxCxO38Avwd44o.jar
    Sep 22, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-m-pLm5f6ndINcxKueEh-_yjmlT1iztI4KrqeP7j3bvY.jar
    Sep 22, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-PYSz4T9NUHk4vqJkY12StYyfr_sCQv5J81PU23XQf3I.jar
    Sep 22, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-1L_qepIP0J8zm1Lfbe5XgBlL6Y8oG3QHAfSs14ennhw.jar
    Sep 22, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-Sj5hdW4zmfG_2bjGNPwig4F-0zEkXY2WaUQjyJpwl3Q.jar
    Sep 22, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-IBxKspjTBSznZLbIVQD--Y2mxdqiRmkQQgXgpgb9wOM.jar
    Sep 22, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 192 files cached, 27 files newly uploaded in 1 seconds
    Sep 22, 2020 6:45:46 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 22, 2020 6:45:46 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 22, 2020 6:45:46 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 22, 2020 6:45:46 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 22, 2020 6:45:46 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 22, 2020 6:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95445 bytes, hash d8759abd6ecad4dcf9d27834ba49cdd6c52e8090d467e6a01113fbebec3e1354> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-2HWavW7K1Nz50ng0uknN1sUugJDUZ-agERP76-w-E1Q.pb
    Sep 22, 2020 6:45:46 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 22, 2020 6:45:47 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-21_23_45_46-2154910386073476390?project=apache-beam-testing
    Sep 22, 2020 6:45:47 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-21_23_45_46-2154910386073476390
    Sep 22, 2020 6:45:47 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-21_23_45_46-2154910386073476390
    Sep 22, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-22T06:45:46.586Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 22, 2020 6:45:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T06:45:54.563Z: Worker configuration: n1-standard-1 in us-central1-a.
    Sep 22, 2020 6:45:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T06:45:55.692Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 22, 2020 6:45:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T06:45:55.728Z: Expanding GroupByKey operations into optimizable parts.
    Sep 22, 2020 6:45:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T06:45:55.756Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 22, 2020 6:45:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T06:45:55.852Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 22, 2020 6:45:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T06:45:55.880Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 22, 2020 6:45:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T06:45:55.913Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 22, 2020 6:45:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T06:45:55.940Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 22, 2020 6:45:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T06:45:56.282Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 22, 2020 6:45:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T06:45:56.359Z: Starting 5 workers in us-central1-a...
    Sep 22, 2020 6:46:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T06:46:16.696Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 22, 2020 6:46:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T06:46:25.515Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 22, 2020 6:46:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T06:46:43.276Z: Workers have started successfully.
    Sep 22, 2020 6:46:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T06:46:43.306Z: Workers have started successfully.
    Sep 22, 2020 6:47:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T06:47:21.167Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 22, 2020 6:47:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T06:47:21.321Z: Cleaning up.
    Sep 22, 2020 6:47:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T06:47:21.417Z: Stopping worker pool...
    Sep 22, 2020 6:48:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T06:48:14.066Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 22, 2020 6:48:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T06:48:14.116Z: Worker pool stopped.
    Sep 22, 2020 6:48:24 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-21_23_45_46-2154910386073476390 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 776bac6b-d863-4a68-85ec-7cf945f4be0e and timestamp: 2020-09-22T06:48:24.933000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    18.213

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 22, 2020 6:48:25 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 2 mins 52.344 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
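
[Editor's note, not part of the Jenkins output] As a hedged aside, one way to reproduce just the failing suite from a Beam checkout is to run the same Gradle task with a test filter and --stacktrace, roughly as below. The integration test also needs pipeline options (GCP project, temp locations, runner), which are omitted here and would have to be supplied.

    ./gradlew :sdks:java:extensions:sql:perf-tests:integrationTest \
        --tests org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT \
        --stacktrace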

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 7s
107 actionable tasks: 64 executed, 43 from cache

Publishing build scan...
https://gradle.com/s/g6ivnq3phs66w

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1024

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1024/display/redirect?page=changes>

Changes:

[ryan.worley] [BEAM-10564] Support more Avro field name formats when mapping to Java

[lourens] Support for Kafka deserialization API with headers (since Kafka API

[lourens] Assert the deserializer method with a Headers argument exists and

[ryan.worley] Test new mappable field names

[lourens] Introduce a kafkaVersion210Test for testing KafkaIOTest against

[noreply] Fix broken link

[Ahmet Altay] Clarify Beam's use of semantic versioning.

[lourens] Let the kafkaVersion210 configuration use a resolution strategy to force

[noreply] [BEAM-7372][BEAM-9372] Removes Python 2 and Python 3.5 Postcommit jobs.

[Kyle Weaver] Clean up CHANGES.md in preparation for 2.25.0 release.

[noreply] Update indexing skips for pandas 1.x (#12896)

[Robert Bradshaw] Dataframe wordcount example.

[Robert Bradshaw] Update changes file.


------------------------------------------
[...truncated 274.45 KB...]
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 22, 2020 12:45:47 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 22, 2020 12:45:47 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 22, 2020 12:45:47 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 22, 2020 12:45:47 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 22, 2020 12:45:47 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 22, 2020 12:45:47 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 22, 2020 12:45:47 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 22, 2020 12:45:47 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 22, 2020 12:45:47 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 22, 2020 12:45:47 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 22, 2020 12:45:47 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 22, 2020 12:45:47 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 22, 2020 12:45:47 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 22, 2020 12:45:47 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 22, 2020 12:45:47 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 22, 2020 12:45:48 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 22, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 22, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-kPOJA-6QBJ9D85NLtQFlpn6rnviygF4dUqbpLB8aL8c.jar
    Sep 22, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-BwCRWLAz_egrwTaZka7k8Nme9PvjiIe9kTCj2Uz_aMg.jar
    Sep 22, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-UuxM10lE11CJV27JpKn69EclBr4hGZPnwOmaoYEoJ2Q.jar
    Sep 22, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-fdOWdW7DZaCPFxJkyWneRXjdarnYzOcomkz0xsBPGe4.jar
    Sep 22, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-bsUlK2oetWC1esWPIb3WlM6KzaHp0lEHhCuzZZvNX_I.jar
    Sep 22, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-JuefbEXL3q8UpKp9Nu1QF87frifPtqFqCXsgOG5gL-0.jar
    Sep 22, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-6aUoQk88yBL0hpiGunmQdlHV0X551PMIuqd1C2b0Gw4.jar
    Sep 22, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-ePgWWqPGmp9kJm5CPwVNME_ssFkvb5XHnVFL58FKJJU.jar
    Sep 22, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-NEJ-pWOU6c9YdB1rKFLpitBppgKZiTyZMn2B-qXviws.jar
    Sep 22, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7445096408310920317.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-kfp0vBqKRMQPQjAfwAzRr15s1eVShrB1NTjVjVV62a8.jar
    Sep 22, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-nbBjVraRucE8ROUbnZoKebO5MOwcFY1INzG--kKqkjw.jar
    Sep 22, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-MqbQUaBhwJtNcJcfLTebqmuVgd4gzG2LVdgMh5d5lBk.jar
    Sep 22, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-IFSCCLD4rN0RWNVJzUknfM3Xkaho8Qw-VkNdexgZ9KM.jar
    Sep 22, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-VQHepCkUWgr2VngVXDnMHGr0fqZWsKhPkOe9l1yeCIw.jar
    Sep 22, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-ddH3pS0Vts3cfdWc3s5LBPWPEv2lePQBe_kEKwC6rHg.jar
    Sep 22, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-LPKXZdlxB0KTnTBLJaPMONbpCb30yFjwhE2Va-UjHMs.jar
    Sep 22, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-ND_fvMKa__1hmBwW7NWq5zoMK5-zsypMPZev6LmvCp4.jar
    Sep 22, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-Id0HqpjNj1Jx_rDd26kUtZAZuKgWpo4iSeNpcG5iZQg.jar
    Sep 22, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-klSfoXKX90hK8LY2xsLoR29TqkD4x_u1esUlyGX1L6I.jar
    Sep 22, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-taCHLv5ZaKG9qrh_jM7znCuVycUcOkDRlhBDtFfyd6o.jar
    Sep 22, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-8HvQPHexjH3a5hWFbHfM1NvUbZ28o3JdsoZNns32Os4.jar
    Sep 22, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-4u7vE14XFXxjbe3I2ybdLUveORPuQZGcBkUz8PhMBQY.jar
    Sep 22, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-qu4yMABQIaDC2knDF5RxClcW6D4Eh8jPYZ85yvVW_Bg.jar
    Sep 22, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-kPOJA-6QBJ9D85NLtQFlpn6rnviygF4dUqbpLB8aL8c.jar
    Sep 22, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-eFWQjkfD8JC6I3XThjsIHtVy9XrBp-veEgDArCo9yRM.jar
    Sep 22, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-Am2uLOdksTm1KlXfBiNFYl1rLnBK23kt4cmtV7jc9po.jar
    Sep 22, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-bwC1BFTYp0jFDd86tdkPCyfu-QbUv7NfZ2zwP6KCNT8.jar
    Sep 22, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-PBGBrWvdyDnrtRgWgPC3VTZQ8OeCnYmbMr1kpZsEgag.jar
    Sep 22, 2020 12:45:53 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 192 files cached, 27 files newly uploaded in 0 seconds
    Sep 22, 2020 12:45:53 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 22, 2020 12:45:53 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 22, 2020 12:45:53 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 22, 2020 12:45:53 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 22, 2020 12:45:53 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 22, 2020 12:45:53 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95447 bytes, hash d71e6331a611eb02a54972c539a32cfd04d5741fc041ee3d86cd831185a5dab2> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-1x5jMaYR6wKlSXLFOaMs_QTVdB_AQe49hs2DEYWl2rI.pb
    Sep 22, 2020 12:45:53 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 22, 2020 12:46:02 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-21_17_45_53-13556326403802827349?project=apache-beam-testing
    Sep 22, 2020 12:46:02 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-21_17_45_53-13556326403802827349
    Sep 22, 2020 12:46:02 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-21_17_45_53-13556326403802827349
    Sep 22, 2020 12:46:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-22T00:45:53.626Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 22, 2020 12:46:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T00:46:08.361Z: Worker configuration: n1-standard-1 in us-central1-a.
    Sep 22, 2020 12:46:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T00:46:09.254Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 22, 2020 12:46:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T00:46:09.293Z: Expanding GroupByKey operations into optimizable parts.
    Sep 22, 2020 12:46:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T00:46:09.334Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 22, 2020 12:46:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T00:46:09.409Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 22, 2020 12:46:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T00:46:09.441Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 22, 2020 12:46:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T00:46:09.479Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 22, 2020 12:46:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T00:46:09.520Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 22, 2020 12:46:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T00:46:09.892Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 22, 2020 12:46:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T00:46:09.972Z: Starting 5 workers in us-central1-a...
    Sep 22, 2020 12:46:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T00:46:30.725Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 22, 2020 12:46:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T00:46:32.713Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Sep 22, 2020 12:46:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T00:46:32.752Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Sep 22, 2020 12:46:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T00:46:38.084Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 22, 2020 12:46:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T00:46:55.214Z: Workers have started successfully.
    Sep 22, 2020 12:46:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T00:46:55.245Z: Workers have started successfully.
    Sep 22, 2020 12:47:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T00:47:28.393Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 22, 2020 12:47:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T00:47:28.556Z: Cleaning up.
    Sep 22, 2020 12:47:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T00:47:28.610Z: Stopping worker pool...
    Sep 22, 2020 12:48:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T00:48:20.518Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 22, 2020 12:48:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T00:48:20.561Z: Worker pool stopped.
    Sep 22, 2020 12:48:29 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-21_17_45_53-13556326403802827349 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 03704b27-bbbc-4d1d-8938-20b81197a580 and timestamp: 2020-09-22T00:48:29.886000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    13.572

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 22, 2020 12:48:30 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 2 mins 52.271 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 12s
107 actionable tasks: 65 executed, 42 from cache

Publishing build scan...
https://gradle.com/s/imyddhr2zig3m

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1023

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1023/display/redirect?page=changes>

Changes:

[Kyle Weaver] [BEAM-10931] Remove obsolete ZetaSQL precommit Gradle task.

[Alan Myrvold] [BEAM-9136] Add python dependency license CSV for license URL and type

[noreply] Merge pull request #12721 from [BEAM-10871] Add deidentify for FhirIO

[noreply] [BEAM-9154] Disable Chicago Taxi Example on Jenkins (#12886)


------------------------------------------
[...truncated 272.37 KB...]
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 21, 2020 6:45:39 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 21, 2020 6:45:39 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 21, 2020 6:45:39 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 21, 2020 6:45:39 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 21, 2020 6:45:39 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 21, 2020 6:45:39 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 21, 2020 6:45:39 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
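
The failure above is the pipeline-construction check that the exception message describes: a ParDo emitting Beam Row elements has no inferable Coder unless the output PCollection carries a Schema. Below is a minimal sketch of the fix the message suggests (PCollection.setRowSchema); the schema, DoFn, and values are hypothetical and are not taken from BigQueryIOPushDownIT itself.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaFixSketch {
      public static void main(String[] args) {
        Pipeline pipeline = Pipeline.create();

        // Hypothetical schema standing in for the four projected HACKER_NEWS columns.
        final Schema rowSchema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();

        PCollection<Row> rows =
            pipeline
                .apply(Create.of("seed"))
                .apply(
                    ParDo.of(
                        new DoFn<String, Row>() {
                          @ProcessElement
                          public void processElement(ProcessContext c) {
                            c.output(
                                Row.withSchema(rowSchema)
                                    .addValues("someone", "story", "A title", 3L)
                                    .build());
                          }
                        }))
                // A ParDo that emits Row has no inferable coder; declaring the schema
                // here attaches RowCoder and avoids the IllegalStateException above.
                .setRowSchema(rowSchema);

        pipeline.run().waitUntilFinish();
      }
    }
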

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 21, 2020 6:45:39 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 21, 2020 6:45:39 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 21, 2020 6:45:39 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 21, 2020 6:45:39 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 21, 2020 6:45:39 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 21, 2020 6:45:39 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 21, 2020 6:45:40 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 21, 2020 6:45:40 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
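
For context, the BeamPushDownIOSourceRel above reads only the four used fields and ships the reported filter to the BigQuery Storage API. The sketch below shows roughly what that push-down corresponds to when written directly against BigQueryIO; the table reference is hypothetical and this is not how the integration test builds its read.

    import java.util.Arrays;
    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;
    import org.apache.beam.sdk.values.PCollection;

    public class DirectReadSketch {
      public static void main(String[] args) {
        Pipeline pipeline = Pipeline.create();

        PCollection<TableRow> rows =
            pipeline.apply(
                "Read HACKER_NEWS with push-down",
                BigQueryIO.readTableRows()
                    // Hypothetical table reference standing in for the test's HACKER_NEWS table.
                    .from("apache-beam-testing:beam.HACKER_NEWS")
                    // DIRECT_READ uses the BigQuery Storage API, which is what makes
                    // projection and filter push-down possible.
                    .withMethod(TypedRead.Method.DIRECT_READ)
                    // Only the columns the query actually uses...
                    .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                    // ...and the predicate the planner reported as "supported".
                    .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

        pipeline.run().waitUntilFinish();
      }
    }
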
    Sep 21, 2020 6:45:40 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 21, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 21, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-bkkesV_hyu8LYeXaVwKgWxSkbL-CG8XrBczCoPiW2FQ.jar
    Sep 21, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-bSeayJLWxBw_cu5S_fxumPwbUOvSoDzMCUF4daVME6o.jar
    Sep 21, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3183716132999712181.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-RQmS9mDRP9rX0gPXLOgEB8_odFP9Vgos2TnDs5-WU9M.jar
    Sep 21, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-QoJ0CRh46Heyb5FNMYLPy7i0ajscBQS7u0YSiffD8iM.jar
    Sep 21, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-o_UEZLiKNFQT3f1oMHKSg2XRQSyIgk5QBHFG_bw696w.jar
    Sep 21, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-bkkesV_hyu8LYeXaVwKgWxSkbL-CG8XrBczCoPiW2FQ.jar
    Sep 21, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-kuENJLzkXAD1wmqb62oaT5MRgDVZq9GgkKAbT77-8WQ.jar
    Sep 21, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-1rzuNmzPACJGr_U2CEL7ivj0yp580JM_ym4uP_EX6LA.jar
    Sep 21, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-DpnjcgPtd0jAJKYrO7qiprVC3c4CkV0FKUPrGJN3_sU.jar
    Sep 21, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-gVYYBaI7BffboAhcKm-mp-IPXhgACgwJ8KBRDqBdpjg.jar
    Sep 21, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-8twWdu5UiILrH8o1NmJ1o01TH3m9f34GiRRhgJuMw9I.jar
    Sep 21, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-i9k6RAhRvwXP8suSvj3L4-W1w-HBwPXtp1tTlnywAjI.jar
    Sep 21, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-c95YYf44KPYn8wrvLmwrviZB7PwY_TtFLN7YwEAvVEI.jar
    Sep 21, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-JxVIPkLwIQCXMNFhQGmNzpEh_JA_lfep_ttTVY4bBqM.jar
    Sep 21, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-HEk_TNdp5P4UKkB3MCn5cS3Wvjah0jJ7VejLFvGCNnY.jar
    Sep 21, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-NA8cvWFs6oPiHwTKJ2SzPcb0miRQ1iYugZovCAeYlI0.jar
    Sep 21, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-G_KQJ8SCXxS-NAfPfJ4rFRgwrEzLYsiJo4bIVYc3UvY.jar
    Sep 21, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-K-Qwng6DCyyVrJD6EwRM3PTnXEug9Bhvgcl44sBtnvo.jar
    Sep 21, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-GBqu-jxHLLW7hIhNzufuwgxaaFZK67TubDKBElDYNZU.jar
    Sep 21, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-RRdynZRXQakLI2t_RYObRZ-irr-CT0_kBYDmacHP8Yg.jar
    Sep 21, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-Eym5FiSbRn58Qf2Aj92pN7PuwbV4s1UqgKYszqhNRas.jar
    Sep 21, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-oTYJ-GNCrkLDG9Aj_quFiW8eDiVTqPlAdK6FKYUAHnI.jar
    Sep 21, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-jMn6re_yXJ0Pl9ijzkGP_AsKTjZzzTkrr2eEo071Xrc.jar
    Sep 21, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-OV3taEl43NSFk5yE_rXeDmzXJERZGMoWA8RCduUgbAc.jar
    Sep 21, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-uRc-5em724zdllpmDcpH04X8mBBh9efO52ChbUXSpmc.jar
    Sep 21, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-YlejEEKwpJky8qA1W1SnjXJagvGHeYXUgpgmYAXigwQ.jar
    Sep 21, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-wGhqmd7CyFCl7W09g4ntP6M3VwwmblGRjUOtDXBvDTA.jar
    Sep 21, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 193 files cached, 26 files newly uploaded in 0 seconds
    Sep 21, 2020 6:45:44 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 21, 2020 6:45:44 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 21, 2020 6:45:44 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 21, 2020 6:45:44 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 21, 2020 6:45:44 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 21, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95445 bytes, hash cbb2688456fd0f2bce71368671012480a982acae83516611b33fcf560c59ed2a> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-y7JohFb9DyvOcTaGcQEkgKmCrK6DUWYRsz_PVgxZ7So.pb
    Sep 21, 2020 6:45:44 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 21, 2020 6:45:46 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-21_11_45_44-6589137125573011016?project=apache-beam-testing
    Sep 21, 2020 6:45:46 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-21_11_45_44-6589137125573011016
    Sep 21, 2020 6:45:46 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-21_11_45_44-6589137125573011016
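
Besides the gcloud command above, a running job can also be cancelled from Java through the PipelineResult handle returned by pipeline.run(). A small sketch, with purely illustrative names:

    import java.io.IOException;
    import org.apache.beam.sdk.PipelineResult;

    public class CancelJobSketch {
      /** Requests cancellation of a still-running job. */
      public static void cancelIfRunning(PipelineResult result) throws IOException {
        if (result.getState() == PipelineResult.State.RUNNING) {
          // On Dataflow this asks the service to cancel the job, much like the
          // gcloud command shown in the log above.
          result.cancel();
        }
      }
    }
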
    Sep 21, 2020 6:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-21T18:45:44.937Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
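
The warning above is what a fixed-size worker pool produces. The following sketch shows worker-pool options that would lead to it; the values mirror the log, everything else is illustrative rather than the test's actual configuration.

    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class WorkerPoolOptionsSketch {
      public static DataflowPipelineOptions options() {
        return PipelineOptionsFactory.fromArgs(
                "--runner=DataflowRunner",
                "--project=apache-beam-testing",
                "--region=us-central1",
                "--tempLocation=gs://temp-storage-for-perf-tests/loadtests",
                // With autoscaling disabled the pool stays at numWorkers, and
                // maxNumWorkers is ignored -- hence the warning in the log.
                "--numWorkers=5",
                "--maxNumWorkers=5",
                "--autoscalingAlgorithm=NONE")
            .as(DataflowPipelineOptions.class);
      }
    }
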
    Sep 21, 2020 6:46:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T18:46:01.195Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 21, 2020 6:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T18:46:04.758Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 21, 2020 6:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T18:46:04.825Z: Expanding GroupByKey operations into optimizable parts.
    Sep 21, 2020 6:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T18:46:04.861Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 21, 2020 6:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T18:46:04.977Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 21, 2020 6:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T18:46:05.031Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 21, 2020 6:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T18:46:05.069Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 21, 2020 6:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T18:46:05.105Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 21, 2020 6:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T18:46:05.642Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 21, 2020 6:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T18:46:05.748Z: Starting 5 workers in us-central1-b...
    Sep 21, 2020 6:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T18:46:28.507Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 21, 2020 6:46:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T18:46:36.005Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 21, 2020 6:46:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T18:46:53.785Z: Workers have started successfully.
    Sep 21, 2020 6:46:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T18:46:53.819Z: Workers have started successfully.
    Sep 21, 2020 6:47:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T18:47:29.614Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 21, 2020 6:47:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T18:47:29.952Z: Cleaning up.
    Sep 21, 2020 6:47:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T18:47:30.119Z: Stopping worker pool...
    Sep 21, 2020 6:48:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T18:48:24.578Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 21, 2020 6:48:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T18:48:24.673Z: Worker pool stopped.
    Sep 21, 2020 6:48:32 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-21_11_45_44-6589137125573011016 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 7db6a064-25fa-4af9-b9ff-1a1443e14bd3 and timestamp: 2020-09-21T18:48:32.866000000Z:
                     Metric:                    Value:
                   read_time                    18.514
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 21, 2020 6:48:33 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.024 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 3 mins 2.846 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 16s
107 actionable tasks: 64 executed, 43 from cache

Publishing build scan...
https://gradle.com/s/uxep4jr7ieja2

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1022

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1022/display/redirect>

Changes:


------------------------------------------
[...truncated 270.06 KB...]
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 21, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 21, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 21, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 21, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 21, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 21, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 21, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 21, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 21, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 21, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 21, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 21, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 21, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 21, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 21, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 21, 2020 12:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 21, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 21, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-nIbf0K11fob-oDCqBbNssOu1qyfTkWsjJrSFze-C5mg.jar
    Sep 21, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-_eBV5rgLLYDIi-W2P4nXVdkeMMrBHVf1PcC5X59UOKk.jar
    Sep 21, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-DOTV-Q2KKRXvQWUzjNOOau0LcPO7G_yMStXdSQgegPs.jar
    Sep 21, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-97tPqit7xvPmHjZjWLhS2tWJlIz18RK73yqV97l391k.jar
    Sep 21, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-VnbLbE2bRDYHAGmmRHge-0NOJkiGeDIu2zCOh9xN_H0.jar
    Sep 21, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-Cg_zHlIOJSOYdOpHgAyt24M6aQ6VLQpvjV8ERWd2MIs.jar
    Sep 21, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-ByF4p5X2ppRHEBQuVS3ocGVP55Y0HWmRK8okQ8DSOAA.jar
    Sep 21, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-9wnRt0kI5SNUncqIcZfedAPGxV2dRMlu8ZWH1_hu8Rs.jar
    Sep 21, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-KGtpx13mGKZwSTfE6XiodK3CFED3Amgd08XQd_c0vQc.jar
    Sep 21, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-5h6MJjGJmjIEZ90tMTH6y2sVHjd9_cjJLK19h_vgQ5w.jar
    Sep 21, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-_tZbLjk9LPVYNB6WEyGiSIlgmdlflPJN7u2IG4vABvc.jar
    Sep 21, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-jk9autvdyJBBUc8MVGY2I1hLZmpu7aBbRRdAkpYTQHg.jar
    Sep 21, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-OLlOWlGnq0hvaVAtompUNUPzLS21UXhzdTRDv__Q4IA.jar
    Sep 21, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-CnKKXiusrBiXx25FM3xoRnPY7sH6nwpJFPBZieNsMNk.jar
    Sep 21, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-GZeMiOA7AQB9dE6sj651exvh2RxHmCtcoeOWRuA1Bqc.jar
    Sep 21, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-ETF0M5siNMneFR8FjJcJTdjxncvkzO7wNogyKJoisRc.jar
    Sep 21, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-9IY_ii2NmZSt4QoUN5pAJDMtlE4pSye3IrhTBEL47PY.jar
    Sep 21, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-rp3tDzWR1JS0vK31gYCPx5pwUpIqe1h4le7LMMpwwx0.jar
    Sep 21, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-xoKA_tLutVzQtP1N7IRBRsPg00Bu_AOl8chsPM6Ljew.jar
    Sep 21, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test1377021143645443993.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-GbRFDd8pM7ntURXlr6H6iC_BrvRFWO-aHzveIQwG8D8.jar
    Sep 21, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-xVTavVfLFOomQ3qaLGxZz3W0mRFibfvQIzG6S0belhI.jar
    Sep 21, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-wcguua7igx-wkTMXyhsh-i5VGLmcLtgKTbKOoPW91vU.jar
    Sep 21, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-nIbf0K11fob-oDCqBbNssOu1qyfTkWsjJrSFze-C5mg.jar
    Sep 21, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-t5QsrFk17M5Cf_4eqk33Onso9TNpNJm0UzySrlEK1As.jar
    Sep 21, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-N5lKUC7o40_K-2u2AgfVEfYr8nDx9jlJZ2vVxjQ9NEM.jar
    Sep 21, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-Uj0GB5YU8VhBlOBrNzazXOvyYzPV1D-r0LvrXpWezP0.jar
    Sep 21, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-P0kNsIlF1CNArOAKsNuVX-9JLgKtRpnusy1Pk26TTKw.jar
    Sep 21, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 193 files cached, 26 files newly uploaded in 0 seconds
    Sep 21, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 21, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 21, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 21, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 21, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 21, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95445 bytes, hash d2c06ad998d89a28854c83d08d4d22eaeaa0739b9ca89520b085a870989bef26> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-0sBq2ZjYmiiFTIPQjU0i6uqgc5ucqJUgsIWocJib7yY.pb
    Sep 21, 2020 12:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 21, 2020 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-21_05_45_25-14451516347427614091?project=apache-beam-testing
    Sep 21, 2020 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-21_05_45_25-14451516347427614091
    Sep 21, 2020 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-21_05_45_25-14451516347427614091
    Sep 21, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-21T12:45:25.248Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 21, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T12:45:33.157Z: Worker configuration: n1-standard-1 in us-central1-a.
    Sep 21, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T12:45:33.908Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 21, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T12:45:33.947Z: Expanding GroupByKey operations into optimizable parts.
    Sep 21, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T12:45:33.974Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 21, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T12:45:34.047Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 21, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T12:45:34.074Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 21, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T12:45:34.109Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 21, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T12:45:34.132Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 21, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T12:45:34.451Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 21, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T12:45:34.531Z: Starting 5 workers in us-central1-a...
    Sep 21, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T12:45:49.054Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 21, 2020 12:46:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T12:46:08.781Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
    Sep 21, 2020 12:46:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T12:46:08.817Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
    Sep 21, 2020 12:46:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T12:46:14.225Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 21, 2020 12:46:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T12:46:29.246Z: Workers have started successfully.
    Sep 21, 2020 12:46:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T12:46:29.281Z: Workers have started successfully.
    Sep 21, 2020 12:47:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T12:47:03.631Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 21, 2020 12:47:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T12:47:03.776Z: Cleaning up.
    Sep 21, 2020 12:47:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T12:47:03.855Z: Stopping worker pool...
    Sep 21, 2020 12:47:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T12:47:55.790Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 21, 2020 12:47:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T12:47:55.838Z: Worker pool stopped.
    Sep 21, 2020 12:48:09 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-21_05_45_25-14451516347427614091 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 802438df-ebce-43bc-ac70-f3015c0f3982 and timestamp: 2020-09-21T12:48:09.452000000Z:
                     Metric:                    Value:
                   read_time                    14.016
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 21, 2020 12:48:09 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 8,5,main]) completed. Took 2 mins 59.705 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 52s
107 actionable tasks: 62 executed, 45 from cache

Publishing build scan...
https://gradle.com/s/pxwr4rz7ymht2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1021

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1021/display/redirect>

Changes:


------------------------------------------
[...truncated 273.20 KB...]
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 21, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
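
The IllegalStateException above means the Row PCollection produced by the ParDo has neither a schema nor a coder attached; as the message itself suggests, calling PCollection.setRowSchema (or setCoder with a RowCoder) resolves it. A minimal, self-contained sketch of that fix follows, using an illustrative schema and DoFn rather than the IT's actual RowMonitor:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaExample {
      // Schema for the four columns the test query selects (illustrative names/types).
      private static final Schema SCHEMA =
          Schema.builder()
              .addNullableField("author", Schema.FieldType.STRING)
              .addNullableField("type", Schema.FieldType.STRING)
              .addNullableField("title", Schema.FieldType.STRING)
              .addNullableField("score", Schema.FieldType.INT64)
              .build();

      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        PCollection<Row> rows =
            p.apply(Create.of("story", "job"))
                .apply(
                    "ToRow",
                    ParDo.of(
                        new DoFn<String, Row>() {
                          @ProcessElement
                          public void processElement(@Element String type, OutputReceiver<Row> out) {
                            out.output(
                                Row.withSchema(SCHEMA).addValues("someone", type, "a title", 3L).build());
                          }
                        }))
                // Without this call, coder inference fails exactly as in the log above.
                .setRowSchema(SCHEMA);

        // Equivalent alternative: rows.setCoder(RowCoder.of(SCHEMA));

        p.run().waitUntilFinish();
      }
    }

Either call works; setRowSchema is usually the better choice because downstream schema-aware transforms can then use the attached schema as well.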

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 21, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 21, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 21, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 21, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 21, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 21, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 21, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 21, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
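
The pushed-down projection and filter above are what DIRECT_READ buys over the DEFAULT read path: with the BigQuery Storage API, only the four used columns are read and the predicate is evaluated server-side rather than after a full table scan. A rough, hand-written equivalent at the BigQueryIO level is sketched below; the public Hacker News table name is an assumption for illustration, and this is not the code the SQL planner actually generates:

    import java.util.Arrays;
    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;
    import org.apache.beam.sdk.values.PCollection;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        // DIRECT_READ uses the BigQuery Storage API, so column projection and the
        // row restriction are applied by BigQuery itself. Running this for real
        // requires GCP credentials and access to the referenced table.
        PCollection<TableRow> rows =
            p.apply(
                BigQueryIO.readTableRows()
                    .from("bigquery-public-data:hacker_news.full")  // assumed table, for illustration
                    .withMethod(TypedRead.Method.DIRECT_READ)
                    .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                    .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }

In the SQL path the planner produces the same effect automatically, which is what the BeamPushDownIOSourceRel node with usedFields and BigQueryFilter in the plan above records.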
    Sep 21, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 21, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 21, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-6fTx9uywdJQrIUF6MxAdHir1lilFtL-6yTgIrPyOMWo.jar
    Sep 21, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-_-sNR7aFlZTEIsLmnsqOOxZ6q76g3Pj7ppqzKlKPFj4.jar
    Sep 21, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-Om1H7ZUPVgwTj0hrKoi_YqY-S9V2j9s1Be3hm2gE4q0.jar
    Sep 21, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-4Bmif_KUfzJtoSs5pmltku_Wn5TH8LhuJXX_hDn9wI8.jar
    Sep 21, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test6552647750856024480.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-2fWDVH5X_14a46gLhnuyX5Ywl5KFf5n1rw-bHzlemRU.jar
    Sep 21, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-refVSHzfbHj0Sliqmkh4U9IdNGhDWhWKpC2F2r3orWA.jar
    Sep 21, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-QilmWCc9ssWcY_FYPXlcPb-gPhSodk01MytSXZGpP9M.jar
    Sep 21, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-ZAc4M_-pdQ-56acn5R4etDP7G29_CGO9yT4QERDqSKk.jar
    Sep 21, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-CrZ8pcKoKxN2eZcyq5w422guuqwM5hZhGSS6QbbtkvY.jar
    Sep 21, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-E_9ds6EfR5TscZK-RfDKgQFXTkHpTMEvCPUCLrsuysY.jar
    Sep 21, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-6fTx9uywdJQrIUF6MxAdHir1lilFtL-6yTgIrPyOMWo.jar
    Sep 21, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-TWr9aumuJhet8FT_xLYGH95hLV8exr6WH-6wgzWVliU.jar
    Sep 21, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-HNyU_i2r_pmIc1awLvhCqKsl9a32oVcVfbSYib0njzU.jar
    Sep 21, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-pLH3Pwk3JvkXv-tcEIPyV-xTxR-_eTo2lnNjvhDBrkA.jar
    Sep 21, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-zRHeN2tPNO53cDzXoifrUNe06TMlqfrpDHZ5XLtbUdM.jar
    Sep 21, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-12MrU96Z3-2tZg1j_b3C_jZ6RNXpnJPPv4O34b735QI.jar
    Sep 21, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-WlcxWIK-85GmeHirXDiTlkHI_vmgdbp0zC7uv5tRLe0.jar
    Sep 21, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-2lho7iOIc0ZXdIFGZKZDBOehBiCi1g3wt1NvA8hcfPI.jar
    Sep 21, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-TnxGoQT7ozEpQe-B7qFElaD2C9ddkhg0S6Oo4_eSLdI.jar
    Sep 21, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-J5A91O2K3-Int1VmChzgrvUh8QAtt3BVeYpGsFVUaG8.jar
    Sep 21, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-DNIGkA-1Oc4aZuP3vqBOBNveEMVebn6-Q2tWC2PwB20.jar
    Sep 21, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-L0cP8lst_WfIUyykhjGb5XK-SmtAUeE4Ev18YYnVtpM.jar
    Sep 21, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-jjKcQ57rYp2t8J8GvafKGA8csZLZTWVEILSeO02l4Uk.jar
    Sep 21, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-KcB-Py9_4cK4ultbhMTnAp7bDuaB6Z_x70-f1IjkGas.jar
    Sep 21, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-a5P0JPSDAyU-lPXUMiTHDfdS3IKb6s4iPey3up7rikg.jar
    Sep 21, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-3tmnxz7e1ypyOf8pXXNkPi8SmVbi9b84SWXGb4Nhr6c.jar
    Sep 21, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT--6SvoK1lTkUkQq7_teHVCNHXpNxShoJaOtz1DlMmxzc.jar
    Sep 21, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/de.flapdoodle.embed/de.flapdoodle.embed.mongo/2.2.0/781d14f4e3d9eeb0b4c3e00a4ec165a04b3dc5c4/de.flapdoodle.embed.mongo-2.2.0.jar to gs://temp-storage-for-perf-tests/loadtests/staging/de.flapdoodle.embed.mongo-2.2.0-vNy3lJC0jW9u4Cy1AHsqSbjRUqOTX9ycpEmHkht7vvk.jar
    Sep 21, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.codehaus.janino/janino/3.0.11/e699e368095379ba0402ea1780a87fcaea16e68f/janino-3.0.11.jar to gs://temp-storage-for-perf-tests/loadtests/staging/janino-3.0.11-kje3HSMpGA5ZIQ6aqhAO4xNFTvCuWIYIx1yxkxlZG-E.jar
    Sep 21, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/de.flapdoodle.embed/de.flapdoodle.embed.process/2.1.2/986b38302fa10018d5baccee7f655c31ee9afe5b/de.flapdoodle.embed.process-2.1.2.jar to gs://temp-storage-for-perf-tests/loadtests/staging/de.flapdoodle.embed.process-2.1.2-OasY7D5KRAimcZcWcjFwgi8Qb4B-iff1FfrVeNSih6A.jar
    Sep 21, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.apache.commons/commons-csv/1.8/37ca9a9aa2d4be2599e55506a6d3170dd7a3df4/commons-csv-1.8.jar to gs://temp-storage-for-perf-tests/loadtests/staging/commons-csv-1.8-qL1WZS7UZo2dWjOZSuUvWbnjnI6w68tmhOaK7udXmmE.jar
    Sep 21, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.codehaus.janino/commons-compiler/3.0.11/f2a6ec7fbc929c9fc87ff8bb486c0574951c5b09/commons-compiler-3.0.11.jar to gs://temp-storage-for-perf-tests/loadtests/staging/commons-compiler-3.0.11-DxpPXyZccBoxkzJErnBF_O8YtPpZUEF-Je5wvlDd2s8.jar
    Sep 21, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.apache.beam/beam-vendor-calcite-1_20_0/0.1/6d16a59dc771784789116607a04acd9a0ef91d16/beam-vendor-calcite-1_20_0-0.1.jar to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-calcite-1_20_0-0.1-1NrX_9FNKiEqNk5qBOaRlj-IwqOvKvQIGIbTVgm_v8Y.jar
    Sep 21, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.antlr/antlr4-runtime/4.7/30b13b7efc55b7feea667691509cf59902375001/antlr4-runtime-4.7.jar to gs://temp-storage-for-perf-tests/loadtests/staging/antlr4-runtime-4.7-KmGUP4A7vR0OAt_9GbkqQY-DNAyZQ0aAnjtR4iMapsA.jar
    Sep 21, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.antlr/ST4/4.0.8/a1c55e974f8a94d78e2348fa6ff63f4fa1fae64/ST4-4.0.8.jar to gs://temp-storage-for-perf-tests/loadtests/staging/ST4-4.0.8-WMqrxAyfdLC1mT_YaOD2SlDAdZCU5qJRqq-tmO38ejs.jar
    Sep 21, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.mongodb/mongo-java-driver/3.9.1/d313237180bf9f2f82e12f503d9617e6b070f792/mongo-java-driver-3.9.1.jar to gs://temp-storage-for-perf-tests/loadtests/staging/mongo-java-driver-3.9.1-mxKxkvmYluxV-Hdn57uyt-MjjSQUsFjxFw9tjhx0bm4.jar
    Sep 21, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/com.ibm.icu/icu4j/58.2/db9fd4b4c189cf1518db14c67d14a2cfcfbe59f6/icu4j-58.2.jar to gs://temp-storage-for-perf-tests/loadtests/staging/icu4j-58.2-lT4eg7K-fD6i-I2obBNhT0fp5x01eMhSHX8Yd1a2OWI.jar
    Sep 21, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.antlr/antlr-runtime/3.5.2/cd9cd41361c155f3af0f653009dcecb08d8b4afd/antlr-runtime-3.5.2.jar to gs://temp-storage-for-perf-tests/loadtests/staging/antlr-runtime-3.5.2-zj_I7LEPOemjzdy7LONQ0nLZzT0LHhjm_nPDuTichzQ.jar
    Sep 21, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.glassfish/javax.json/1.0.4/3178f73569fd7a1e5ffc464e680f7a8cc784b85a/javax.json-1.0.4.jar to gs://temp-storage-for-perf-tests/loadtests/staging/javax.json-1.0.4-Dh3sQKHt6WWUElHtqWiu7gUsxPUDeLwxbMSOgVm9vrQ.jar
    Sep 21, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.antlr/antlr4/4.7/cd6df46532bccabd8127c18c9ca5ef481962e931/antlr4-4.7.jar to gs://temp-storage-for-perf-tests/loadtests/staging/antlr4-4.7-eGclcCizNzrwEd7nts6bWHqP1cegsl9osv9MuQvoqgc.jar
    Sep 21, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.abego.treelayout/org.abego.treelayout.core/1.0.3/457216e8e6578099ae63667bb1e4439235892028/org.abego.treelayout.core-1.0.3.jar to gs://temp-storage-for-perf-tests/loadtests/staging/org.abego.treelayout.core-1.0.3--l4xOVw5wufUasoPgfcgYJMWB7L6Qb02A46yy2-5MyY.jar
    Sep 21, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 179 files cached, 40 files newly uploaded in 1 seconds
    Sep 21, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 21, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 21, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 21, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 21, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 21, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95445 bytes, hash f825ee5e5fff8e1a6ac38bbcfcf07b276114f4d108f77b3fbc79031c66a4ce1b> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline--CXuXl__jhpqw4u8_PB7J2EU9NEI93s_vHkDHGakzhs.pb
    Sep 21, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 21, 2020 6:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-20_23_45_20-13725072535501620475?project=apache-beam-testing
    Sep 21, 2020 6:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-20_23_45_20-13725072535501620475
    Sep 21, 2020 6:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-20_23_45_20-13725072535501620475
    Sep 21, 2020 6:45:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-21T06:45:20.769Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 21, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T06:45:29.445Z: Worker configuration: n1-standard-1 in us-central1-a.
    Sep 21, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T06:45:30.074Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 21, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T06:45:30.111Z: Expanding GroupByKey operations into optimizable parts.
    Sep 21, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T06:45:30.165Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 21, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T06:45:30.234Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 21, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T06:45:30.265Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 21, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T06:45:30.295Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 21, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T06:45:30.328Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 21, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T06:45:30.753Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 21, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T06:45:30.821Z: Starting 5 workers in us-central1-a...
    Sep 21, 2020 6:45:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T06:45:54.361Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
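
The warning above concerns Cloud Monitoring descriptors for user-defined metrics; inside the pipeline itself, such metrics (like the read_time and fields_read values this test reports) are declared through the Beam Metrics API. A small illustrative sketch, assuming a hypothetical DoFn rather than the IT's actual monitor transforms:

    import org.apache.beam.sdk.metrics.Counter;
    import org.apache.beam.sdk.metrics.Distribution;
    import org.apache.beam.sdk.metrics.Metrics;
    import org.apache.beam.sdk.transforms.DoFn;

    // Hypothetical monitored DoFn: counts elements and records a size distribution.
    public class MonitoredFn extends DoFn<String, String> {
      private final Counter fieldsRead = Metrics.counter(MonitoredFn.class, "fields_read");
      private final Distribution readTime = Metrics.distribution(MonitoredFn.class, "read_time");

      @ProcessElement
      public void processElement(@Element String element, OutputReceiver<String> out) {
        fieldsRead.inc();
        readTime.update(element.length());  // placeholder measurement for illustration
        out.output(element);
      }
    }

On Dataflow these user metrics surface both as custom.googleapis.com/* descriptors (subject to the quota mentioned above) and under dataflow.googleapis.com/job/user_counter.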
    Sep 21, 2020 6:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T06:45:55.755Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Sep 21, 2020 6:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T06:45:55.788Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Sep 21, 2020 6:46:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T06:46:01.036Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 21, 2020 6:46:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T06:46:14.942Z: Workers have started successfully.
    Sep 21, 2020 6:46:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T06:46:14.973Z: Workers have started successfully.
    Sep 21, 2020 6:46:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T06:46:50.708Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 21, 2020 6:46:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T06:46:50.845Z: Cleaning up.
    Sep 21, 2020 6:46:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T06:46:50.941Z: Stopping worker pool...
    Sep 21, 2020 6:47:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T06:47:35.421Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 21, 2020 6:47:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T06:47:35.472Z: Worker pool stopped.
    Sep 21, 2020 6:47:42 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-20_23_45_20-13725072535501620475 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): c9db7fa5-2ea8-4517-be9c-3aac3587a738 and timestamp: 2020-09-21T06:47:42.947000000Z:
                     Metric:                    Value:
                   read_time                    18.151
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 21, 2020 6:47:43 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.019 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 2 mins 36.347 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 26s
107 actionable tasks: 62 executed, 45 from cache

Publishing build scan...
https://gradle.com/s/4hbqq4ub4sa4s

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1020

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1020/display/redirect>

Changes:


------------------------------------------
[...truncated 269.77 KB...]
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_3/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 21, 2020 12:45:22 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 21, 2020 12:45:22 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 21, 2020 12:45:23 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 21, 2020 12:45:23 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 21, 2020 12:45:23 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 21, 2020 12:45:23 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 21, 2020 12:45:23 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 21, 2020 12:45:23 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 21, 2020 12:45:23 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 21, 2020 12:45:23 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 21, 2020 12:45:23 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 21, 2020 12:45:23 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 21, 2020 12:45:23 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 21, 2020 12:45:23 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 21, 2020 12:45:23 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 21, 2020 12:45:24 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 21, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 21, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-a7tA5tBS8nZBQI45KsDmhne3vv8A4NCpmsuOZO1Bi4o.jar
    Sep 21, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-5mdUCx09gqjJtt3yKez6ViUWOIbYIeN8eJHXZECS95c.jar
    Sep 21, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-BH9hdVbnpJPb2JKSvlDCvIeBDAJfuECyFyU4Li6LjD8.jar
    Sep 21, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-EncFyGUZMIiWeZYHtfqlYxDC8fw4R34SiiyWp6HJLsc.jar
    Sep 21, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-qZq_30SZrL74Gdhb3cY5U8_ruWzCwczqnKYBcxMwcY8.jar
    Sep 21, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-0wi3HjsVbdqjjUp0HtCOrg_YkpwgX4S8cnqhMg-F9tw.jar
    Sep 21, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-e0QYuhzMplEOBImrq-dWwMf6v3CkktjnnEq6Wqnzv1U.jar
    Sep 21, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-n9bgR8SutXvTQZXjUVSq6oFM8bSJdD1jFcW4LFI5IW8.jar
    Sep 21, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-RFQQ8saCErSZYXVXcH8LKyHFEP5JEHhIlP1fbILBbcA.jar
    Sep 21, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-z8k0l-dROB984Kwl9ovRgFDKfMdGUBsmK7nrOApzOsA.jar
    Sep 21, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-WmkeN0RfUopmhBWSQuwYEALH1JqaqFfMEEijmFkfWaU.jar
    Sep 21, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-AypiAwdb7EQQXTtXLVcs28f0fc_5QUhp2_4qKITzC5c.jar
    Sep 21, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-S-905rNqei-x4tppXM6eRlymeRefUyMkQRjKPwDh8tc.jar
    Sep 21, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-3Zx5o9dpS-gA4jbQEqbnQQMw-y880-TYMW_SwkFV-ZE.jar
    Sep 21, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-erJmYA0BAjITk6NyitwJ-OY-iAdrt6ZGdjzE-vuVLJg.jar
    Sep 21, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-_U08HB7iF0ityRC0zJRVyCv9NjGqZEOucMs4ATU81XE.jar
    Sep 21, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-59DyPGl6YUU3O9mHFAEqOAOo_6EuyKm058QwirY7XkE.jar
    Sep 21, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-Vav9AJIl5uFMHVbLhsYQcZAWlCYGCoVIeNsFriHMoCc.jar
    Sep 21, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-5aeyU6g0y5HRZrfeilWgDADHlVVu1TTdPXAjvDO0NJQ.jar
    Sep 21, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-AypiAwdb7EQQXTtXLVcs28f0fc_5QUhp2_4qKITzC5c.jar
    Sep 21, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-HxlwmndQhB_MOzAZoKeVHKNe0wO9AbrCMQsdrRii2aw.jar
    Sep 21, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-246bVRi_NecX7eOW2ruf-AwU0ixnjo6G_vslN4AhI3Q.jar
    Sep 21, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-wZ-bV7GD6ylN0eXewf9tyKyxyToUgNigNtBPK5n8P-Q.jar
    Sep 21, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8682017055224343196.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-IvcUvEP_mYsvVkJa2iYsMEycppTufZ94StSpYKjgvjg.jar
    Sep 21, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-IjKIBrKV4TsOaGSiY1c-56gOutcWEmpKAitk4iZ8JPY.jar
    Sep 21, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-0mTzrBtSI1TxW9FejwMze0zw5AxBucoWgsJth_OHQ5Q.jar
    Sep 21, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-e_gIaGGjPxBofLg9f_nU9j7MJXzFs8_ZmipQes0tjzM.jar
    Sep 21, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 193 files cached, 26 files newly uploaded in 0 seconds
    Sep 21, 2020 12:45:28 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 21, 2020 12:45:28 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 21, 2020 12:45:28 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 21, 2020 12:45:28 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 21, 2020 12:45:28 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 21, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95445 bytes, hash 8f1785b7f3606c256e3951360cf31a8987c140fedf4ba8c12b4474f002aabc9d> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-jxeFt_NgbCVuOVE2DPMaiYfBQP7fS6jBK0R08AKqvJ0.pb
    Sep 21, 2020 12:45:29 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 21, 2020 12:45:32 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-20_17_45_29-3141953906935979055?project=apache-beam-testing
    Sep 21, 2020 12:45:32 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-20_17_45_29-3141953906935979055
    Sep 21, 2020 12:45:32 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-20_17_45_29-3141953906935979055
    Sep 21, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-21T00:45:29.091Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 21, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T00:45:38.685Z: Worker configuration: n1-standard-1 in us-central1-a.
    Sep 21, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T00:45:39.191Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 21, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T00:45:39.231Z: Expanding GroupByKey operations into optimizable parts.
    Sep 21, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T00:45:39.259Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 21, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T00:45:39.341Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 21, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T00:45:39.370Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 21, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T00:45:39.405Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 21, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T00:45:39.440Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 21, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T00:45:39.814Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 21, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T00:45:39.871Z: Starting 5 workers in us-central1-a...
    Sep 21, 2020 12:45:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T00:45:59.127Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 21, 2020 12:46:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T00:46:04.214Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 21, 2020 12:46:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T00:46:28.085Z: Workers have started successfully.
    Sep 21, 2020 12:46:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T00:46:28.120Z: Workers have started successfully.
    Sep 21, 2020 12:47:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T00:47:01.419Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 21, 2020 12:47:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T00:47:01.561Z: Cleaning up.
    Sep 21, 2020 12:47:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T00:47:01.653Z: Stopping worker pool...
    Sep 21, 2020 12:47:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T00:47:50.301Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 21, 2020 12:47:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T00:47:50.343Z: Worker pool stopped.
    Sep 21, 2020 12:47:58 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-20_17_45_29-3141953906935979055 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 1e3c5d81-b8a2-4dba-ba1c-f82aee33a059 and timestamp: 2020-09-21T00:47:58.520000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    14.552

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 21, 2020 12:47:59 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.032 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 2 mins 44.73 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 41s
107 actionable tasks: 62 executed, 45 from cache

Publishing build scan...
https://gradle.com/s/lft327mz76b7u

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1019

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1019/display/redirect>

Changes:


------------------------------------------
[...truncated 271.00 KB...]
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_3/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)
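
The IllegalStateException above is Beam's standard complaint that a PCollection of Row has no schema attached, so no RowCoder can be inferred for it. A minimal sketch of the first remedy the message suggests, setRowSchema, follows; the field layout is assumed from the query columns (by, type, title, score) and the class and method names are illustrative only, not taken from the test:

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    class RowSchemaFixSketch {
      // Attach a schema so the SDK can infer a RowCoder for the Row output.
      static PCollection<Row> attachSchema(PCollection<Row> rows) {
        Schema schema =
            Schema.builder()
                .addNullableField("by", Schema.FieldType.STRING)
                .addNullableField("type", Schema.FieldType.STRING)
                .addNullableField("title", Schema.FieldType.STRING)
                .addNullableField("score", Schema.FieldType.INT64)
                .build();
        return rows.setRowSchema(schema);
      }
    }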

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 20, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 20, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 20, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 20, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 20, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 20, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 20, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
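
readUsingDefaultMethod fails with the same missing-coder error, and the message also offers setCoder as an alternative. A sketch of that route, again with placeholder names, pinning an explicit RowCoder built from the same assumed schema:

    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    class RowCoderFixSketch {
      // Equivalent effect to setRowSchema: set the coder for the Rows explicitly.
      static PCollection<Row> attachCoder(PCollection<Row> rows, Schema schema) {
        return rows.setCoder(RowCoder.of(schema));
      }
    }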

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 20, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 20, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 20, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 20, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 20, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 20, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 20, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 20, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
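
The filter and column projection logged above are what the SQL planner pushes into the BigQuery storage read. For comparison, a hand-written BigQueryIO read with the same projection and row restriction would look roughly like the sketch below; the table reference is a placeholder, not the table used by this test:

    import com.google.api.services.bigquery.model.TableRow;
    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;
    import org.apache.beam.sdk.values.PCollection;

    class DirectReadSketch {
      // Storage API read that fetches only the needed columns and filters server-side.
      static PCollection<TableRow> read(Pipeline p, String tableSpec) {
        return p.apply(
            BigQueryIO.readTableRows()
                .from(tableSpec) // e.g. "my-project:my_dataset.hacker_news" (placeholder)
                .withMethod(TypedRead.Method.DIRECT_READ)
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));
      }
    }
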
    Sep 20, 2020 6:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 20, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 20, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT--n8sI4CMsMUf6A9n8cMkp2ASrwwodjqtZoSaJbbAK10.jar
    Sep 20, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-1KuXi7U_u5coqigk9z9FUXDZgS7hH1xJ2NuKhF4He4Y.jar
    Sep 20, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-BCcgjZTYD9f_9NEg-Zhv1PzmzkCOn6CzGrgzjOuYkOI.jar
    Sep 20, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-JHov32yIzKSa0DLSuq4wSaXTckolp4YAEnql3IIaMkw.jar
    Sep 20, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-HwE0-0LAcY-PLKH2mCvl1M_SgU6ENnCL4K7LTTmDiHQ.jar
    Sep 20, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-WETmD6qwYNlFLm7rWknfML2KE0K9HUN3BXITg2yR0ow.jar
    Sep 20, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-kEEwbyak9Td2ergVHmtCEyi5vqM-ddjKqyg080CzJrs.jar
    Sep 20, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-K24HGqBq5KPIdGcLZpfL2l2-G-C2pAubTw5D4P9aKK8.jar
    Sep 20, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-AtkBWglXcMQJEKQvrk9TnotRZUvdjfmSOSM8xUO7xcU.jar
    Sep 20, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-XzBoiTaV3rnL-3pawLpf2V6OrNhVCUMPj7dwFDSLFeY.jar
    Sep 20, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-IrygagselvZf3LMgJgxCgNfVrE2ExqsDmW86Zgs0Yhw.jar
    Sep 20, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-vbGm4o1gQNwEqaaiZXVSImAgKigSVy-M7vx8mjfL8_s.jar
    Sep 20, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests--mjqPx2plUbnKSXemMZesM3LYw6t762r97eZW4lV_Xs.jar
    Sep 20, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-BVWIgdN7MHASeTh4nE1LTF-cBflGMHTYOdSvLbU_EJ0.jar
    Sep 20, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-dKgdqV3x1N1VLVJlRhin53Zl1Dt49YxaoxcSwwTTRQA.jar
    Sep 20, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-AaXwt1vL7SC5Y2zqaE5fLjwo-n1kxc0pJNenKxpL94Y.jar
    Sep 20, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-TUBbndA4C0MdUPqCKoKQ8TZ9xuo6opQnaM4Aw4zaVtM.jar
    Sep 20, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-oX5zE61xdpy71by5MrJN9T2gQQ93kJjMzHPDzIOTx-s.jar
    Sep 20, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-4a9_ITi6g87KQSxM7O0_o1QLe1j0GkAcgskiY9-21gc.jar
    Sep 20, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-_GPg_nrtCjkGgLSBgS8gyE5rCE77-txCoB3Z0iRgJ_k.jar
    Sep 20, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT--n8sI4CMsMUf6A9n8cMkp2ASrwwodjqtZoSaJbbAK10.jar
    Sep 20, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-OZ5yZJsiCN4uGtfbQmUVTCbLiFHa700aqyyR-LRkEgI.jar
    Sep 20, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-RBBXkpjZMu_XpMNe6-o4vgjpQ2RhwnywrbW2_RKmz9w.jar
    Sep 20, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-de2PBm7UkXuayuBEEzhcQVquuiXAXYdYUSyF5ZT2qsI.jar
    Sep 20, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test564696704809297892.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-TQ7z5CMWP7TR6p6QLSC8wI2BZfBNhrllOFoAhz6N3v8.jar
    Sep 20, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-tpwccbWLmPBtGSuJSA8i7Fw4GasiscB23RnYvnv-IkM.jar
    Sep 20, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-C0Kfl-CxBfU207cPIXmwn8SspOYuYY9_bNvoh9n3vDg.jar
    Sep 20, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 193 files cached, 26 files newly uploaded in 0 seconds
    Sep 20, 2020 6:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 20, 2020 6:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 20, 2020 6:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 20, 2020 6:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 20, 2020 6:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 20, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95444 bytes, hash 6803426bb3092946937c27c4b5c99ff4dff73ebd938c7ffd3915876f394f0585> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-aANCa7MJKUaTfCfEtcmf9N_3Pr2TjH_9ORWHbzlPBYU.pb
    Sep 20, 2020 6:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 20, 2020 6:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-20_11_45_22-2079414324754370787?project=apache-beam-testing
    Sep 20, 2020 6:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-20_11_45_22-2079414324754370787
    Sep 20, 2020 6:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-20_11_45_22-2079414324754370787
    Sep 20, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-20T18:45:22.811Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 20, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T18:45:30.667Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 20, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T18:45:31.714Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 20, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T18:45:31.776Z: Expanding GroupByKey operations into optimizable parts.
    Sep 20, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T18:45:31.827Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 20, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T18:45:31.916Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 20, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T18:45:31.966Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 20, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T18:45:32.015Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 20, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T18:45:32.082Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 20, 2020 6:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T18:45:32.596Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 20, 2020 6:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T18:45:32.713Z: Starting 5 workers in us-central1-b...
    Sep 20, 2020 6:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T18:46:01.007Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 20, 2020 6:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T18:46:03.851Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 20, 2020 6:46:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T18:46:23.007Z: Workers have started successfully.
    Sep 20, 2020 6:46:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T18:46:23.065Z: Workers have started successfully.
    Sep 20, 2020 6:46:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T18:46:57.813Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 20, 2020 6:46:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T18:46:57.956Z: Cleaning up.
    Sep 20, 2020 6:46:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T18:46:58.029Z: Stopping worker pool...
    Sep 20, 2020 6:47:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T18:47:50.336Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 20, 2020 6:47:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T18:47:50.373Z: Worker pool stopped.
    Sep 20, 2020 6:48:00 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-20_11_45_22-2079414324754370787 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 73cdebb0-7a5b-4a7e-942d-33e50a300130 and timestamp: 2020-09-20T18:48:00.461000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    15.252

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 20, 2020 6:48:01 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.033 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker,5,main]) completed. Took 2 mins 51.875 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 44s
107 actionable tasks: 62 executed, 45 from cache

Publishing build scan...
https://gradle.com/s/uopigm5rgwqmi

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1018

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1018/display/redirect>

Changes:


------------------------------------------
[...truncated 269.69 KB...]
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_3/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 20, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 20, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 20, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 20, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 20, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 20, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 20, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 20, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 20, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 20, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 20, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 20, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 20, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 20, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 20, 2020 12:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
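
The same query can also be expressed against a schema-aware PCollection of Row through the public SqlTransform API; a minimal sketch follows, assuming the input already carries a schema (PCOLLECTION is SqlTransform's default table name for a single input):

    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    class SqlQuerySketch {
      // Runs the projection and filter over an input that already has a schema attached.
      static PCollection<Row> filterStories(PCollection<Row> hackerNews) {
        return hackerNews.apply(
            SqlTransform.query(
                "SELECT `by` AS author, type, title, score "
                    + "FROM PCOLLECTION "
                    + "WHERE (type = 'story' OR type = 'job') AND score > 2"));
      }
    }
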
    Sep 20, 2020 12:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 20, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 20, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-oCLrbH5nRwgRbrTtg--V-Y6I-QarhPhUCuuyL4O8cas.jar
    Sep 20, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-c-p10mpnzzPSRfVgRPigZvnDvJIwiMRb8YrX545U4-k.jar
    Sep 20, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-H0D4FDVjFNz4k40zRqu9z8zGBozfeKVsq9gmxvd9bb0.jar
    Sep 20, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-6BhFjsq4mRiuRgWtUzOV2jyU83eU-A6ZJJcF6TSO-AU.jar
    Sep 20, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-pb_BdlWbxsa7I1SZyANaEwNZrtifNnZ4PLBCcPO86Sc.jar
    Sep 20, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8664879349511158062.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-Xf9JW3JzuEC_7vqMytNO0KaHJGKM2vft_JdfNU4w9Do.jar
    Sep 20, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-4sN-Cr0XChZPG52O-eTrmFKIe-B6umLQwM6r5U08CCQ.jar
    Sep 20, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-fAClgjNIPA_9ftloUxDNH1nrqtD97iHLAr_QwngPqsI.jar
    Sep 20, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-aIBDItcYCdWvXXJVKgthkobWvpQEY9Pwxier3ndGUyI.jar
    Sep 20, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-Uqx5iGjlCf_HFo5pLS9KDIhCzlp6rj6PvhT-D7AiQf0.jar
    Sep 20, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-eGKspAM15TpOB434ZO52Q4mUoJ-kICezunG_3qDA9VY.jar
    Sep 20, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-cdD6DzWKBe3OFfhr-Cbi07h10XaGfmt15I7gpe9JEFY.jar
    Sep 20, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-eG4uKHFseINSjz-7elfwAdWSyqwLQ6Q_Ge3mXYEB1-c.jar
    Sep 20, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-0RTjyiAJ-jJ-a1-grAowxf4ALA5wkWJ9IwTu3EW2ZF0.jar
    Sep 20, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-rkp96itfJKuE9Ejj4N0DsyyXpkhHvSyqwt3bGi97HUQ.jar
    Sep 20, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-4RpPZW9nUvmI5UI-Z9gJt8xKyW1wfbZgSvj99bUok8o.jar
    Sep 20, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-k0623Ztsw68W2FlNjByjDmnKus14n-trryffhJSIAoo.jar
    Sep 20, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-ZOeerKX0xcciiZPXqpb3HLBp77OmWK8vi6HElTw7Tps.jar
    Sep 20, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-oCLrbH5nRwgRbrTtg--V-Y6I-QarhPhUCuuyL4O8cas.jar
    Sep 20, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-xL8Pn7aGCGeF3C6ARM42GQPr3yCwBzRH_9zWJ2nu-PI.jar
    Sep 20, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-eTJT2SXCW0VF7Gy0lSGujhaC9He0YLeUxpa07QuMDHE.jar
    Sep 20, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-mGud3fCKhgK5U6tF6cFClm2im3wFm1WwljZTd7NRbjU.jar
    Sep 20, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-d_5-Fhq_UZbC713QgjCH3hYa76RSv0WGbZLYi8irhNg.jar
    Sep 20, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-K_WEgUqE9dyMZ4JqRErO-Y7G-gjtlHhfV9BlQKxwed4.jar
    Sep 20, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-w_03CJhuMQyr7eQi7hip72Ha1TT9w2vaCmRzE1h64Eo.jar
    Sep 20, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-d9-KI235l5EKOpt0LQgdmqwgW-pyZBANpDKnPV9Lxco.jar
    Sep 20, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-YgHvC7-qgFl7OOusM2G3DDND0NqRcPztFUYMBBj4ZAk.jar
    Sep 20, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 193 files cached, 26 files newly uploaded in 0 seconds
    Sep 20, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 20, 2020 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 20, 2020 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 20, 2020 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 20, 2020 12:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 20, 2020 12:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95448 bytes, hash 2974ecb3875e9e660e86f0b18c7b1a1e5ef7091a2fd02b89258867704f15089f> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-KXTss4denmYOhvCxjHsaHl73CRov0CuJJYhncE8VCJ8.pb
    Sep 20, 2020 12:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 20, 2020 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-20_05_45_25-9017304659873718613?project=apache-beam-testing
    Sep 20, 2020 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-20_05_45_25-9017304659873718613
    Sep 20, 2020 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-20_05_45_25-9017304659873718613
    Sep 20, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-20T12:45:25.541Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 20, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T12:45:32.679Z: Worker configuration: n1-standard-1 in us-central1-a.
    Sep 20, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T12:45:33.350Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 20, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T12:45:33.387Z: Expanding GroupByKey operations into optimizable parts.
    Sep 20, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T12:45:33.424Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 20, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T12:45:33.505Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 20, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T12:45:33.538Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 20, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T12:45:33.575Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 20, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T12:45:33.599Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 20, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T12:45:33.911Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 20, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T12:45:33.993Z: Starting 5 workers in us-central1-a...
    Sep 20, 2020 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T12:45:47.014Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 20, 2020 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T12:45:58.302Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 20, 2020 12:46:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T12:46:18.636Z: Workers have started successfully.
    Sep 20, 2020 12:46:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T12:46:18.673Z: Workers have started successfully.
    Sep 20, 2020 12:46:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T12:46:53.348Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 20, 2020 12:46:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T12:46:53.527Z: Cleaning up.
    Sep 20, 2020 12:46:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T12:46:53.627Z: Stopping worker pool...
    Sep 20, 2020 12:47:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T12:47:46.339Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 20, 2020 12:47:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T12:47:46.385Z: Worker pool stopped.
    Sep 20, 2020 12:47:55 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-20_05_45_25-9017304659873718613 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 81589ba3-aa2b-4174-bb78-3d9eb97d05db and timestamp: 2020-09-20T12:47:55.822000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    17.469

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 20, 2020 12:47:56 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':',5,main]) completed. Took 2 mins 45.58 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
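
To reproduce locally, the failing task named above can be re-run from a Beam checkout with extra diagnostics enabled, for example (the integration test additionally requires the job's pipeline options, which are omitted here):

    ./gradlew :sdks:java:extensions:sql:perf-tests:integrationTest --tests "*.BigQueryIOPushDownIT" --stacktrace --info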

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 40s
107 actionable tasks: 62 executed, 45 from cache

Publishing build scan...
https://gradle.com/s/yxtsclnm6nayy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1017

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1017/display/redirect>

Changes:


------------------------------------------
[...truncated 271.50 KB...]
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 20, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 20, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 20, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 20, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 20, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 20, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 20, 2020 6:45:17 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
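
The failure above is Beam's standard complaint when a PCollection of Rows reaches pipeline finalization without a schema or explicit coder. As a general illustration only (not the fix applied in this IT), a Row-producing step can declare its output schema so the coder can be inferred; RowMonitorFn below is a hypothetical pass-through DoFn<Row, Row> and the field list is illustrative:

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    static PCollection<Row> monitorRows(PCollection<Row> rows) {
      // Illustrative schema, loosely mirroring the query's projection (author, type, title, score).
      Schema outputSchema =
          Schema.builder()
              .addNullableField("author", Schema.FieldType.STRING)
              .addNullableField("type", Schema.FieldType.STRING)
              .addNullableField("title", Schema.FieldType.STRING)
              .addNullableField("score", Schema.FieldType.INT64)
              .build();
      return rows
          .apply("RowMonitor", ParDo.of(new RowMonitorFn())) // hypothetical DoFn, not the test's class
          .setRowSchema(outputSchema);                       // or setCoder(RowCoder.of(outputSchema))
    }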

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 20, 2020 6:45:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 20, 2020 6:45:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 20, 2020 6:45:17 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 20, 2020 6:45:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 20, 2020 6:45:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 20, 2020 6:45:17 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 20, 2020 6:45:17 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])
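
For readers following the plan output: the SQL shown above is issued through Beam SQL, and the planner rewrites the table scan into the BeamPushDownIOSourceRel with the projected fields and the supported filter pushed to BigQuery. A minimal sketch of running such a query with SqlTransform (not the IT's actual code; registering the HACKER_NEWS table with the BigQuery table provider is assumed and omitted):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    PCollection<Row> result =
        pipeline.apply(
            "HackerNewsQuery",
            SqlTransform.query(
                "SELECT `by` AS author, type, title, score "
                    + "FROM HACKER_NEWS "
                    + "WHERE (type = 'story' OR type = 'job') AND score > 2"));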

    Sep 20, 2020 6:45:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 20, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 20, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 20, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-_nVyYi0r6LvvRIm_HbCASJVHmmVYXUSrxLMlXRQNRFY.jar
    Sep 20, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-dTlc_ZCSlaOfM5D9Uyn6-V6ko0nX9SVES3ttVg7ecrE.jar
    Sep 20, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT--LKb0kd-LTCHpNo2hlmJNGigLPcc1PJgtkg4rgeeass.jar
    Sep 20, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-5iSKK1gis2CSiGGbt0cRn4F32yiqd6HtUWnn7cB5BWM.jar
    Sep 20, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-L0Hg5Ck6vqugvY_EmRKR--jn8ZY1sv5y26QSfB20rWE.jar
    Sep 20, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-xcNdKVNkZKLT2cd5dwRQ6OmByOAJ2MXh-NZszVTITxM.jar
    Sep 20, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-EETnHVVu-qqwV84hwvbphUr8G8u78_HKPkzSoV_4IHI.jar
    Sep 20, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-JLd2AbkIS5t4rhSpra02dTVcLdwt8W-wB21gSa3JqxU.jar
    Sep 20, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-uOBQgzyjTVHejIslFOPyzEeirni5YO7D205rd4wrAdc.jar
    Sep 20, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-c6ATq0zlunG2DSA64wWwR1Zldl0ZemZr8BRSZdGzdSs.jar
    Sep 20, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test649361941710624551.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-yyukRWmkrbt-Whpzlry0AkXBPo978sdZHGA6dqAef_8.jar
    Sep 20, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-Jl31dYqWInTyYEa1gVe_gSQTtzF7nb4_P3ipJiFeRt4.jar
    Sep 20, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-naiOgCVdtEYBRX4z7RkG6QVCDApDsGc0EDEBaDzk2vw.jar
    Sep 20, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-JnmlzYXXYVvx0iQYX08oFctBaHnM-kHZCcaPdwpqQkA.jar
    Sep 20, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-kl3HhcIbzjLnrvuYpK5lnaESIBB8ngOw3kCJCZjktoo.jar
    Sep 20, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-GMzxYehMw9r1NDYjrz41JjiRrYyXgKYBcVNSxggN9UU.jar
    Sep 20, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-xcNdKVNkZKLT2cd5dwRQ6OmByOAJ2MXh-NZszVTITxM.jar
    Sep 20, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-DZO7qaztLcCeCUZ7oDld60ACvFwsLzhaAOJWJ5WMQnA.jar
    Sep 20, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-8Cswar7DMgTwFWX1o4Y-7VccsT9MNHezJ8HhTGAOGqI.jar
    Sep 20, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-I19VMW88TiwFKwK4otm2IfzZ2a8DEj4LguIkfFAN8QI.jar
    Sep 20, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-QM_swhBkpsu_wLDDwcvpJsmZ_1Kcoyfo6HVgMknc7kU.jar
    Sep 20, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-fpuZ35nYf_BDLD0TwuSqKnI9BhjMi_Z08GXToph6sJM.jar
    Sep 20, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-bUhkxlSwsWTK7HWwszjZ0bkZDQgUPj8iC9ULTY0iloQ.jar
    Sep 20, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-ytdcoqJq3zk-zaao76UnfHS2Dcp09b1ZX4tR9YNZL_Y.jar
    Sep 20, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-rNng08-V7oAOeWUkdAQBWQjtvroR8U2W-nqrrfhg-18.jar
    Sep 20, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-SCo4lMaDGtLJjMlsjpg31DholgROkwdO0U2nRg3eGDk.jar
    Sep 20, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-8K_H-LorEvR0OEAJsKe8Qtx6b_qYtpEoYYTkM60SSSo.jar
    Sep 20, 2020 6:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 193 files cached, 26 files newly uploaded in 1 seconds
    Sep 20, 2020 6:45:22 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 20, 2020 6:45:22 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 20, 2020 6:45:22 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 20, 2020 6:45:22 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 20, 2020 6:45:22 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 20, 2020 6:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95444 bytes, hash d07e77d70f4c9d51ddc017e0b215f4629b447e8d770788d5c33c8d6ead3c7e79> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-0H531w9MnVHdwBfgshX0YptEfo13B4jVwzyNbq08fnk.pb
    Sep 20, 2020 6:45:22 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 20, 2020 6:45:23 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-19_23_45_22-9055895011856506504?project=apache-beam-testing
    Sep 20, 2020 6:45:23 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-19_23_45_22-9055895011856506504
    Sep 20, 2020 6:45:23 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-19_23_45_22-9055895011856506504
    Sep 20, 2020 6:45:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-20T06:45:22.689Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 20, 2020 6:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T06:45:29.671Z: Worker configuration: n1-standard-1 in us-central1-a.
    Sep 20, 2020 6:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T06:45:30.378Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 20, 2020 6:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T06:45:30.414Z: Expanding GroupByKey operations into optimizable parts.
    Sep 20, 2020 6:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T06:45:30.440Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 20, 2020 6:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T06:45:30.531Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 20, 2020 6:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T06:45:30.561Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 20, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T06:45:30.587Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 20, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T06:45:30.615Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 20, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T06:45:31.083Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 20, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T06:45:31.154Z: Starting 5 workers in us-central1-a...
    Sep 20, 2020 6:45:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T06:45:54.290Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Sep 20, 2020 6:45:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T06:45:54.328Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Sep 20, 2020 6:46:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T06:45:59.627Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 20, 2020 6:46:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T06:46:03.293Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 20, 2020 6:46:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T06:46:17.777Z: Workers have started successfully.
    Sep 20, 2020 6:46:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T06:46:17.810Z: Workers have started successfully.
    Sep 20, 2020 6:46:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T06:46:58.161Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 20, 2020 6:46:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T06:46:58.313Z: Cleaning up.
    Sep 20, 2020 6:46:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T06:46:58.387Z: Stopping worker pool...
    Sep 20, 2020 6:47:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T06:47:43.235Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 20, 2020 6:47:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T06:47:43.268Z: Worker pool stopped.
    Sep 20, 2020 6:47:51 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-19_23_45_22-9055895011856506504 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 7f887d48-ea8e-4976-9a55-3aaf16a1a216 and timestamp: 2020-09-20T06:47:51.381000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    20.588

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 20, 2020 6:47:51 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.022 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 2 mins 42.924 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 34s
107 actionable tasks: 62 executed, 45 from cache

Publishing build scan...
https://gradle.com/s/euxdobtjouy7g

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1016

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1016/display/redirect?page=changes>

Changes:

[noreply] [BEAM-9547] Add indexing tests (#12856)


------------------------------------------
[...truncated 269.99 KB...]
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 20, 2020 12:45:20 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 20, 2020 12:45:20 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 20, 2020 12:45:20 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 20, 2020 12:45:20 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 20, 2020 12:45:20 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 20, 2020 12:45:20 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 20, 2020 12:45:20 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 20, 2020 12:45:20 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 20, 2020 12:45:20 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 20, 2020 12:45:20 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 20, 2020 12:45:20 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 20, 2020 12:45:20 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 20, 2020 12:45:20 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 20, 2020 12:45:21 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])
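
As context for the "BigQuery method is set to: DIRECT_READ" lines above: DIRECT_READ selects the BigQuery Storage Read API rather than the default export-based read. A minimal sketch of making the same choice when using BigQueryIO directly (illustrative only, with a public table reference; not what this test runs):

    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
    import org.apache.beam.sdk.values.PCollection;

    PCollection<TableRow> rows =
        pipeline.apply(
            "ReadWithStorageApi",
            BigQueryIO.readTableRows()
                .from("bigquery-public-data:hacker_news.full") // illustrative table, not the test's
                .withMethod(Method.DIRECT_READ));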

    Sep 20, 2020 12:45:21 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 20, 2020 12:45:22 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 20, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 20, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-heHa_wLGFieenNyyBPn0dqVrzczMuMQnTz6bz2v6QcI.jar
    Sep 20, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-6ZdgnBTj2R5ld299c5TAZpO_yVNDhwFW25rYtx5Q-_Q.jar
    Sep 20, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-dFpA27fzYgAM7wl9c-900_oyisIvESzsJ-aML49v41k.jar
    Sep 20, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-CPJh1uxmRi7Vg1KaLIAOyUH5il7rHREZ-6bnClZ6TkA.jar
    Sep 20, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-XDFs1N4sJ-OaYqsknoQ_J5E9Amo8anHQ-AJNKgQEk1g.jar
    Sep 20, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-PuW-ve8BkebTWSHshFOloXdXLusaXi-fdWS16cyHsus.jar
    Sep 20, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-SXUAy2mXTIMNPylVnC6-Ao9sXsyJnnMuwp-tvN7_dN8.jar
    Sep 20, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-jdzH4ebyXHyvMQ2YzL_m_ezwLZI1_EPLeG1HBXYFb0E.jar
    Sep 20, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-JpCW4sclFBx2kyg21XfnckrLflFnbMTq5b16Na45-gg.jar
    Sep 20, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-er4PZM2swMchucoJe_QhH1CfCZS47PLLAsbrsezMMVM.jar
    Sep 20, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-GUqg28SYOxM6sj_53ss_pwkh06ptAlS8LJps2EQGBWg.jar
    Sep 20, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-mOViXUTQrZR5AwGy0MOg-GIXi9ziVwLYWjK-j5gYqws.jar
    Sep 20, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-heHa_wLGFieenNyyBPn0dqVrzczMuMQnTz6bz2v6QcI.jar
    Sep 20, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-qwT5lpbQbbk1dTXJ5Q4LMa5DsYEI3Qx2lQtfEP58OoM.jar
    Sep 20, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-hi_kDIVH5qYyPNk_aOz51BbHfyEuyB3DG7_9-il6aeY.jar
    Sep 20, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-yfZD2bWTOKoYRqQOIaEWI7wUI4Mta9vssLjgI6d6leg.jar
    Sep 20, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-_svO8kyrR-7C3UtCofrHoREdY63y4I3-n9saPYgmM3g.jar
    Sep 20, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-u3CMIR07uvdqsPbyx35zkz9duq-j5Yp4LCC2MBoTjMQ.jar
    Sep 20, 2020 12:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-9GfR3upmCSkiUjtMf2ls_lGj_Xq5yh1TP6l_-2nMuus.jar
    Sep 20, 2020 12:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-XilDjs-Joy4Hx7WH0NjVH5cpPgi2Ou95mXgtw6oyYSA.jar
    Sep 20, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test6907139141169186156.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-6D7MfAnm8T-omOI8KkQe0bR-EZkiR_f4p8OI5zJLcXg.jar
    Sep 20, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-gQJrwCN_ykFiwWPHHEer2n_lpl25WWvSmRaqpY9473E.jar
    Sep 20, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-vI7j-zPHE0yTAoyNMbzvkhcw8WBsfNBiuYz5oTl3fuA.jar
    Sep 20, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-Z93aJK507GAEmZ-9TWP-uKZXDVmYs4w2pD41V-fPdbw.jar
    Sep 20, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-J__H8hAzdOFh0gTJAHdoEyn8T7onQlm_zLov7ysGmBw.jar
    Sep 20, 2020 12:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-Ev3cdHQeQsYgNyE_nzHIciJbF9vJRc3bi8lcMD66fvs.jar
    Sep 20, 2020 12:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-0sd2Goxr50Wpy-moi1YGplfmrHIIY_Pq0YDeIw4DEKM.jar
    Sep 20, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 193 files cached, 26 files newly uploaded in 2 seconds
    Sep 20, 2020 12:45:27 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 20, 2020 12:45:27 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 20, 2020 12:45:27 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 20, 2020 12:45:27 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 20, 2020 12:45:27 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 20, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95445 bytes, hash 012f809d149121065e24c35783cadebff0af28d74e99c923899a9dd2e00ae807> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-AS-AnRSRIQZeJMNXg8rev_CvKNdOmckjiZqd0uAK6Ac.pb
    Sep 20, 2020 12:45:27 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 20, 2020 12:45:29 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-19_17_45_27-13911827363939590386?project=apache-beam-testing
    Sep 20, 2020 12:45:29 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-19_17_45_27-13911827363939590386
    Sep 20, 2020 12:45:29 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-19_17_45_27-13911827363939590386
    Sep 20, 2020 12:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-20T00:45:27.979Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 20, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T00:45:35.925Z: Worker configuration: n1-standard-1 in us-central1-a.
    Sep 20, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T00:45:36.690Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 20, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T00:45:36.724Z: Expanding GroupByKey operations into optimizable parts.
    Sep 20, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T00:45:36.753Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 20, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T00:45:36.819Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 20, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T00:45:36.845Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 20, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T00:45:36.891Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 20, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T00:45:36.928Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 20, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T00:45:37.553Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 20, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T00:45:37.635Z: Starting 5 workers in us-central1-a...
    Sep 20, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T00:45:51.301Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 20, 2020 12:46:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T00:46:04.697Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Sep 20, 2020 12:46:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T00:46:04.723Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Sep 20, 2020 12:46:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T00:46:34.966Z: Workers have started successfully.
    Sep 20, 2020 12:46:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T00:46:34.994Z: Workers have started successfully.
    Sep 20, 2020 12:46:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T00:46:52.010Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 20, 2020 12:47:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T00:47:09.772Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 20, 2020 12:47:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T00:47:09.940Z: Cleaning up.
    Sep 20, 2020 12:47:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T00:47:10.004Z: Stopping worker pool...
    Sep 20, 2020 12:47:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T00:47:58.203Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 20, 2020 12:47:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T00:47:58.243Z: Worker pool stopped.
    Sep 20, 2020 12:48:07 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-19_17_45_27-13911827363939590386 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 15558596-0991-4951-b62b-38af2044a262 and timestamp: 2020-09-20T00:48:07.831000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    15.535

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 20, 2020 12:48:08 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 2 mins 56.147 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 50s
107 actionable tasks: 62 executed, 45 from cache

Publishing build scan...
https://gradle.com/s/lm4bri3epqzri

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1015

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1015/display/redirect>

Changes:


------------------------------------------
[...truncated 269.25 KB...]
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_3/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)
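
The exception above names the fix: the Row PCollection emitted by the RowMonitor ParDo has no schema attached, so no coder can be inferred for it. A minimal sketch of the setRowSchema remedy the message suggests -- the class name, schema and pass-through ParDo here are assumptions for illustration, not the actual BigQueryIOPushDownIT code:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline pipeline = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Illustrative schema for the projected columns (author, type, title, score).
        Schema schema =
            Schema.builder()
                .addNullableField("author", Schema.FieldType.STRING)
                .addNullableField("type", Schema.FieldType.STRING)
                .addNullableField("title", Schema.FieldType.STRING)
                .addNullableField("score", Schema.FieldType.INT64)
                .build();

        PCollection<Row> monitored =
            pipeline
                .apply(
                    Create.of(Row.withSchema(schema).addValues("someone", "story", "a title", 3L).build())
                        .withCoder(RowCoder.of(schema)))
                .apply(
                    "RowMonitor",
                    ParDo.of(
                        new DoFn<Row, Row>() {
                          @ProcessElement
                          public void process(@Element Row row, OutputReceiver<Row> out) {
                            out.output(row); // pass-through stand-in for the monitoring ParDo
                          }
                        }));

        // Without this call, pipeline construction fails with the
        // "Unable to return a default Coder ... PCollection.setRowSchema" error shown above.
        monitored.setRowSchema(schema);

        pipeline.run().waitUntilFinish();
      }
    }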

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 19, 2020 6:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 19, 2020 6:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 19, 2020 6:45:12 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 19, 2020 6:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 19, 2020 6:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 19, 2020 6:45:12 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 19, 2020 6:45:12 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
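
The same missing-schema issue is behind this readUsingDefaultMethod failure. The other remedy the message lists, an explicit setCoder call, is equivalent for Row collections; a minimal sketch under the assumption that a matching Schema is already in hand (the helper class and method name are made up for illustration):

    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    final class RowCoderSketch {
      private RowCoderSketch() {}

      // Equivalent to rows.setRowSchema(schema): attaching RowCoder explicitly means the
      // CoderRegistry is never asked to infer a coder for the Row type.
      static PCollection<Row> withExplicitRowCoder(PCollection<Row> rows, Schema schema) {
        return rows.setCoder(RowCoder.of(schema));
      }
    }

Either call has to happen before the collection is consumed by the next apply, which is why the check fires in TransformHierarchy.finishSpecifyingInput during pipeline construction rather than at run time.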

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 19, 2020 6:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 19, 2020 6:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 19, 2020 6:45:12 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 19, 2020 6:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 19, 2020 6:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 19, 2020 6:45:12 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 19, 2020 6:45:12 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 19, 2020 6:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
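
For context on what the push-down reported above amounts to at the IO level: the projection (usedFields) and the filter become a BigQuery Storage API read with selected fields and a row restriction. A rough, hedged sketch using BigQueryIO directly -- the table reference is a placeholder, and this is not the IT's actual wiring, which goes through the Beam SQL table provider:

    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class DirectReadSketch {
      public static void main(String[] args) {
        Pipeline pipeline = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        pipeline.apply(
            BigQueryIO.readTableRows()
                // Placeholder table reference standing in for the HACKER_NEWS table.
                .from("bigquery-public-data:hacker_news.full")
                .withMethod(TypedRead.Method.DIRECT_READ)
                // Projection pushed into the read session, mirroring usedFields above.
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                // Predicate pushed down, mirroring the filter logged above.
                .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

        pipeline.run().waitUntilFinish();
      }
    }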
    Sep 19, 2020 6:45:13 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 19, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 19, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-ulMcp4VJYsJLN_rJ0toODLxn648w1a-pryC-syrNGD0.jar
    Sep 19, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-b0Ze6bNw_0rpqEI0E7I8O-lV7DF9DUhZBY8h4kxNBzA.jar
    Sep 19, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-F6JYJ_1Ka5TZKj5AK2RHoWUqSB5BuXEQae9gFtrZmxM.jar
    Sep 19, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-OalURPzLnuXJxlIRuGGWknoKQaBmljykW0WJahutqsM.jar
    Sep 19, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-Uwqc0OQ7LdJQ0FjBVMSa0q62jIUd3qSAkaacLba1t3g.jar
    Sep 19, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-hH2ZStc56Lp3rHPS04lgStIhWRm_u1rzHJI-dnQ61DQ.jar
    Sep 19, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-xevc66nHLjvYzHOoB_jJUSZso7fqI0wYgDzj1Phz-o0.jar
    Sep 19, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-XcclZuxmBHdXtcejM1L5YcCu1cKogWjK11dCKKDKrHA.jar
    Sep 19, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-xJo4mb3wVCZl3vHmqX93yLBZXBQ4yxyC6yLE7CYFEfA.jar
    Sep 19, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-tj82YBswWvUaf58s5og2rG0IDNsL7gKzwv-JEKU1g9w.jar
    Sep 19, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7994822536720672114.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-DS_9Hz0j9I2QbPIJ3NpFUBq2MjUlzAUMwP-uitvZsnw.jar
    Sep 19, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-T5azQ29BeJAiQZEWEKghceH3w0_wA5fnNqjv62yyx-E.jar
    Sep 19, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-lcRIwS4sTfr54vW9V5p3FTdv5V-BanWZmkJUTAYvUcU.jar
    Sep 19, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-GL8bH8rCvQsZKz2j3Ur11T_PdDHoLSpLpWG_xMLglXw.jar
    Sep 19, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-ydzKb0sRm1qoHV9BhXIAQ-KSO_VbQy2V4QNfDkKss0w.jar
    Sep 19, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-6u0DGnagUPpSEO2qwPgIpzX_pGLH7jV9koA_MNb_jwE.jar
    Sep 19, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-M9AAXvlpRbyIEPDuTH0f-sXKSKTCBsd_WM7kd8TP_ms.jar
    Sep 19, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-ulMcp4VJYsJLN_rJ0toODLxn648w1a-pryC-syrNGD0.jar
    Sep 19, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-oi_uBYBoXj3d_H4Qsddsqjz1zf0D8t6udSVqZ8bx2Ic.jar
    Sep 19, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-0CJ7iVO-Zay-MDnYzp3I1NXxv223jD7QrV-RX0G3cd4.jar
    Sep 19, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-q5I6-ANGI22R42wYJs4-0w97nvNpolx_wxqY50B7oyY.jar
    Sep 19, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-4x9oN1bT_hrnosinwvByxh5XD4EPknGKnzn_EfW93eA.jar
    Sep 19, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-U7z5p0UqBoyay0--djLCQCi0HkK6z3eA57muIwNuM00.jar
    Sep 19, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-oBs8VbCYdELfzZkMVamv93wlWMx9D2Nqhpllz49gQVY.jar
    Sep 19, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-yBBz7kB10Nk2K9qi0Ff_l4bnIiOGuTMXLnu05l34K_0.jar
    Sep 19, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-1JCrGyNlUG6NnTsVgn5Brp7X6-eUcbO1D0EADBPnNgc.jar
    Sep 19, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-FTvmZowqazh89eK5i57ilLmRjQYmGb7hkC0JHcONAzc.jar
    Sep 19, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 193 files cached, 26 files newly uploaded in 0 seconds
    Sep 19, 2020 6:45:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 19, 2020 6:45:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 19, 2020 6:45:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 19, 2020 6:45:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 19, 2020 6:45:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 19, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95445 bytes, hash da6bafbebc59823aabde6a862b85e43de66d3d56be5c5a35dc70edb92482f004> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-2muvvrxZgjqr3mqGK4XkPeZtPVa-XFo13HDtuSSC8AQ.pb
    Sep 19, 2020 6:45:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 19, 2020 6:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-19_11_45_16-7783838822090100067?project=apache-beam-testing
    Sep 19, 2020 6:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-19_11_45_16-7783838822090100067
    Sep 19, 2020 6:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-19_11_45_16-7783838822090100067
    Sep 19, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-19T18:45:16.764Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 19, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T18:45:25.241Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 19, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T18:45:26.268Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 19, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T18:45:26.307Z: Expanding GroupByKey operations into optimizable parts.
    Sep 19, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T18:45:26.338Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 19, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T18:45:26.411Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 19, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T18:45:26.443Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 19, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T18:45:26.480Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 19, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T18:45:26.512Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 19, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T18:45:26.865Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 19, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T18:45:26.937Z: Starting 5 workers in us-central1-b...
    Sep 19, 2020 6:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T18:45:56.038Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 19, 2020 6:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T18:45:56.243Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 19, 2020 6:46:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T18:46:17.410Z: Workers have started successfully.
    Sep 19, 2020 6:46:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T18:46:17.439Z: Workers have started successfully.
    Sep 19, 2020 6:46:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T18:46:50.222Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 19, 2020 6:46:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T18:46:50.350Z: Cleaning up.
    Sep 19, 2020 6:46:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T18:46:50.423Z: Stopping worker pool...
    Sep 19, 2020 6:47:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T18:47:35.822Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 19, 2020 6:47:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T18:47:35.872Z: Worker pool stopped.
    Sep 19, 2020 6:47:43 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-19_11_45_16-7783838822090100067 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 91a361e0-2e01-4ded-8fb1-4ad8dc6eadc3 and timestamp: 2020-09-19T18:47:43.196000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    14.249

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 19, 2020 6:47:43 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.022 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 5,5,main]) completed. Took 2 mins 39.221 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 28s
107 actionable tasks: 62 executed, 45 from cache

Publishing build scan...
https://gradle.com/s/22m3qu3ugomvo

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1014

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1014/display/redirect>

Changes:


------------------------------------------
[...truncated 270.50 KB...]
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 19, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 19, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 19, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 19, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 19, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 19, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 19, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 19, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 19, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 19, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 19, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 19, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 19, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 19, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 19, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 19, 2020 12:45:13 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 19, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 19, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-FbqPuZOxAoKykOqkNfSVIOrtfDxetX1uu6w0WxHYXkU.jar
    Sep 19, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-IpFTqO9Xdj96G8FDz2KIN7xWi2gZCxzxSKUuKDTlXu4.jar
    Sep 19, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-CBqr22mazMbS8M_kD2Px0gZmSwBasUDvZ2bQxEtLux0.jar
    Sep 19, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-oQJar59lhPJHWl1M6WxZTVE7LZp5gsVDtUBjylJgQpw.jar
    Sep 19, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-dMaDAEya6kd3r-XprMPtEnOBhQAsUOMgxtU6oafcWo4.jar
    Sep 19, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-IwK3prPQBJ504mgr2BiNKGZYlQ6vLDOv9vhw80aYT4w.jar
    Sep 19, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-ZfxwkaIDdisYUy4oqL18GJolHOiW2Pg6rcGUWftHO8E.jar
    Sep 19, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-7T9v_KRcmLmtw78ZhfHpMltI6_B9J4WzDxlpjlBV1Zc.jar
    Sep 19, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-QP94JTwSWtkaU5-abJ5Y205xaaodqk69fetrlOIoH3U.jar
    Sep 19, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-FbqPuZOxAoKykOqkNfSVIOrtfDxetX1uu6w0WxHYXkU.jar
    Sep 19, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-A83P9woBJaOjvDucrHzZGX1AbYkIfmNftEEI_Vdvqco.jar
    Sep 19, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-KvUGAJqtWE_PLiDfZ2j4VQUJqvcRDYZkDogWuJ36fZc.jar
    Sep 19, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-dhQUgSyEzYxU3X7wOILEy-9S5-C5m0mU5SldiTw95jA.jar
    Sep 19, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-su2Y2wEVq6yt0KAxWpQsMK6Mb38_fzi9AL6BMUI7GLc.jar
    Sep 19, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-iQkNnEmTMth00UkPcviISvC_zJGG6lVW_flPASSvnRM.jar
    Sep 19, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4650752779039135780.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-WdZNlMhyD-g7yE7HO3cLtlW1Yr8QoBEW5os3c4xtqVo.jar
    Sep 19, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-ViRxBNOYhGMCJ_3sbB43qT1oJ-6iHcyeVseKCcaqgyc.jar
    Sep 19, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-WbrPK1S-QSsPMzzLID8iFDYQkkS49enizHanoUfVTAo.jar
    Sep 19, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-UeLokLtxYLUOIizJ6mfcW_iiqomb4zKMlkCKzYG5CHk.jar
    Sep 19, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-VxKPG6Ov8J5BqxktiMV80hzYvYwvC5RNdhHv926XZps.jar
    Sep 19, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-YpmUARdmHUH0-VV-HWgEO5srrcbqpjXMUjCxoJioM1Q.jar
    Sep 19, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-GlWaxSJDMlNC6RtIaZUUy3aJjzSeEYBl27y6WO5LkmM.jar
    Sep 19, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-o8JjFyaMoYStElZbubEoXx5fIxwuKU8jwYRQl3Bwm34.jar
    Sep 19, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-cSmJU8aStObxLMJ_egvBd3atRXOz9kZ_TWulPEOo-R4.jar
    Sep 19, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-MwGsT2ySd6PoQy7cFvHwcS-UxypbhG6BM0do9vQCwt0.jar
    Sep 19, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-xUohqBrrqfNX5TMAd0yJi5wE7Jl8hrTjsLawIwj03Ko.jar
    Sep 19, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-Cj0914gSKYo6eLZLeKeQqzhGJuo4pELRR3V7ZyNBkts.jar
    Sep 19, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 193 files cached, 26 files newly uploaded in 0 seconds
    Sep 19, 2020 12:45:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 19, 2020 12:45:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 19, 2020 12:45:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 19, 2020 12:45:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 19, 2020 12:45:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 19, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95445 bytes, hash ec265b64f36a0c6ed5ca125635123c6dd3b4728a116eba5a079ffbce80cb24ba> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-7CZbZPNqDG7VyhJWNRI8bdO0cooRbrpaB5_7zoDLJLo.pb
    Sep 19, 2020 12:45:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 19, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-19_05_45_16-877911829046410927?project=apache-beam-testing
    Sep 19, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-19_05_45_16-877911829046410927
    Sep 19, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-19_05_45_16-877911829046410927
    Sep 19, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-19T12:45:16.908Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 19, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T12:45:23.882Z: Worker configuration: n1-standard-1 in us-central1-a.
    Sep 19, 2020 12:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T12:45:24.694Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 19, 2020 12:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T12:45:24.723Z: Expanding GroupByKey operations into optimizable parts.
    Sep 19, 2020 12:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T12:45:24.764Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 19, 2020 12:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T12:45:24.845Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 19, 2020 12:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T12:45:24.875Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 19, 2020 12:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T12:45:24.898Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 19, 2020 12:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T12:45:24.933Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 19, 2020 12:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T12:45:25.403Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 19, 2020 12:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T12:45:25.503Z: Starting 5 workers in us-central1-a...
    Sep 19, 2020 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T12:45:51.948Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 19, 2020 12:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T12:46:02.969Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
    Sep 19, 2020 12:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T12:46:03.003Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
    Sep 19, 2020 12:46:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T12:46:08.308Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 19, 2020 12:46:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T12:46:33.318Z: Workers have started successfully.
    Sep 19, 2020 12:46:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T12:46:33.357Z: Workers have started successfully.
    Sep 19, 2020 12:47:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T12:47:10.927Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 19, 2020 12:47:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T12:47:11.071Z: Cleaning up.
    Sep 19, 2020 12:47:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T12:47:11.148Z: Stopping worker pool...
    Sep 19, 2020 12:47:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T12:47:55.083Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 19, 2020 12:47:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T12:47:55.121Z: Worker pool stopped.
    Sep 19, 2020 12:48:06 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-19_05_45_16-877911829046410927 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 3f9102e0-ff57-4d11-b560-ac62857fe245 and timestamp: 2020-09-19T12:48:06.411000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    19.277

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 19, 2020 12:48:06 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.018 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 8,5,main]) completed. Took 3 mins 1.968 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 50s
107 actionable tasks: 62 executed, 45 from cache

Publishing build scan...
https://gradle.com/s/ypxm5jgopxizm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1013

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1013/display/redirect?page=changes>

Changes:

[noreply] [BEAM-9547] Add not_implemented_ok (#12857)

[noreply] Allow pandas 1.x. (#12869)


------------------------------------------
[...truncated 273.33 KB...]
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 19, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 19, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 19, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 19, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 19, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 19, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 19, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
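
The IllegalStateException above is Beam's generic coder-inference failure for a PCollection<Row>: the Row output of ParDo(RowMonitor) has no schema attached, so no default coder can be chosen. The fix the message points at is PCollection.setRowSchema. Below is a minimal, self-contained sketch of that fix; the class name, schema, sample row, and pass-through DoFn are illustrative assumptions, not the integration test's actual code (which would need the schema of the BigQuery table it reads).

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline pipeline = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Hypothetical schema mirroring the four projected HACKER_NEWS columns.
        Schema schema = Schema.builder()
            .addNullableField("author", Schema.FieldType.STRING)
            .addNullableField("type", Schema.FieldType.STRING)
            .addNullableField("title", Schema.FieldType.STRING)
            .addNullableField("score", Schema.FieldType.INT64)
            .build();

        Row sample = Row.withSchema(schema)
            .addValues("someone", "story", "Example title", 3L)
            .build();

        PCollection<Row> rows = pipeline
            .apply(Create.of(sample).withCoder(RowCoder.of(schema)))
            // Stand-in for a Row-to-Row ParDo such as RowMonitor.
            .apply("PassThrough", ParDo.of(new DoFn<Row, Row>() {
              @ProcessElement
              public void processElement(@Element Row row, OutputReceiver<Row> out) {
                out.output(row);
              }
            }))
            // Without this call, pipeline construction fails with the same
            // "Unable to return a default Coder ... PCollection.setRowSchema"
            // IllegalStateException shown in the log above.
            .setRowSchema(schema);

        pipeline.run().waitUntilFinish();
      }
    }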

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 19, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 19, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 19, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 19, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 19, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 19, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 19, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 19, 2020 6:45:33 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 19, 2020 6:45:34 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 19, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 19, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-UeVJtPNrxsbKQGUkv4ki_Uls_J1xxy1gEH0zprcdoG0.jar
    Sep 19, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-XNZ_Jy6z6rRWK29lF2sAL4o9fCdclzDgFKhtEEnCOVo.jar
    Sep 19, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-jy5ceOPPRg1aC2qIsyTf1k_v-JBROaE9W6wAKt2ANI0.jar
    Sep 19, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-rWQlWrHPijCQaXYpNWtJB9_Id6JP1-R8u0LeTgKj6ls.jar
    Sep 19, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-Rjn72h7fan1NKEeu6MUYYn1I3qFX1djstZHn5DXnh4Q.jar
    Sep 19, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-UeVJtPNrxsbKQGUkv4ki_Uls_J1xxy1gEH0zprcdoG0.jar
    Sep 19, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-neb_gCfW7pIYMvrs8375_tdfiL-5Lcj0QkXVZrX02Is.jar
    Sep 19, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-jqwoW_hBDgRZziFsr66AUphNkJM6mIguNHFOaty-CGA.jar
    Sep 19, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-3JAZzmvEHt4yTZ2cutifHc3Aj5xuy1pAZKbUoH5cRXc.jar
    Sep 19, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-OUSpK5msczc1pFnFNo0BpVZr4JtwuvTTQD0goSCH2-c.jar
    Sep 19, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-u4-m_y5gz1W4WEZCLL_yWCmT9xJREoiWigB6IXuD_YE.jar
    Sep 19, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-PD80N8wLOcTH5OfDGiVk7eBsYzTD48PY1uUtkxiGjuo.jar
    Sep 19, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-nzlQ23BTVemPmclO9wPKfynisEBig1PVWGjszEAriWU.jar
    Sep 19, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-G15kjGYbxRaRkEXbD-G1j-lvpy5oz0XFpgaKNJS_lmI.jar
    Sep 19, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2674166951773094783.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-TyvQSUjZC6sILJyjKUwLPhVLIRmk4tUvrxpuSsuLdts.jar
    Sep 19, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-gk3OZn7MEaG5K0xV0Mvv_mwyDYkDbnSMo10Xa3sg9Zo.jar
    Sep 19, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-PsJgrRHNsvfcBbZkK34kWZtZdWo_snN7dSfytwBhod0.jar
    Sep 19, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-EZUvOq3mVYdAdrNHPpj1EkQisQT6VNXeu71MrvqVAzQ.jar
    Sep 19, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-KVT7I8WUJv9ttb1NPEioQUhMIsGPRREFLEnX4QfnkqI.jar
    Sep 19, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-o5DwlcBNx-6bByY-WfNUT89iwAEl2Q8JvqiK9AUUnCE.jar
    Sep 19, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-8qn_o8fVUdLcOA1bl19woXpfum9EqUsnnIreym422Ck.jar
    Sep 19, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-eM8pyd96ZM6IH10zISLCk4e0ZCfPiJzu1csD5DWgfmA.jar
    Sep 19, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-AMFvbu1FfRYF7_rMOiWWE19XFsXc-N1x8bGmSy6K_Lo.jar
    Sep 19, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-QtpqsIAEvmLDT9h51CFs-u8sk4DUzTgEPl6KEuzksfc.jar
    Sep 19, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-u2IGSduaGZy2VHdqW5-URFWvxitZ8rYdxFpBfqJi23M.jar
    Sep 19, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-jPHxUAL46QG3VrdhjGXpK7uJcrQ0uVopB39WVCV-9_E.jar
    Sep 19, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-VH8tTYBfvfPLefjKZZSbfrNo14zk3aWQPXUPflTxLP8.jar
    Sep 19, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 193 files cached, 26 files newly uploaded in 1 seconds
    Sep 19, 2020 6:45:38 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 19, 2020 6:45:39 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 19, 2020 6:45:39 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 19, 2020 6:45:39 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 19, 2020 6:45:39 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 19, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95445 bytes, hash 8d103c133cde3bcb968d987301ba462978c5d484a4b63e46a7bba6731f08dd0f> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-jRA8EzzeO8uWjZhzAbpGKXjF1ISktj5Gp7umcx8I3Q8.pb
    Sep 19, 2020 6:45:39 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 19, 2020 6:45:40 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-18_23_45_39-345060914686757828?project=apache-beam-testing
    Sep 19, 2020 6:45:40 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-18_23_45_39-345060914686757828
    Sep 19, 2020 6:45:40 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-18_23_45_39-345060914686757828
    Sep 19, 2020 6:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-19T06:45:39.411Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 19, 2020 6:45:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T06:45:50.536Z: Worker configuration: n1-standard-1 in us-central1-a.
    Sep 19, 2020 6:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T06:45:51.485Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 19, 2020 6:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T06:45:51.682Z: Expanding GroupByKey operations into optimizable parts.
    Sep 19, 2020 6:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T06:45:51.713Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 19, 2020 6:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T06:45:51.772Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 19, 2020 6:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T06:45:51.813Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 19, 2020 6:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T06:45:51.845Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 19, 2020 6:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T06:45:51.881Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 19, 2020 6:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T06:45:52.312Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 19, 2020 6:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T06:45:52.393Z: Starting 5 workers in us-central1-a...
    Sep 19, 2020 6:46:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T06:46:12.621Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 19, 2020 6:46:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T06:46:16.661Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 19, 2020 6:46:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T06:46:40.107Z: Workers have started successfully.
    Sep 19, 2020 6:46:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T06:46:40.137Z: Workers have started successfully.
    Sep 19, 2020 6:47:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T06:47:11.618Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 19, 2020 6:47:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T06:47:11.790Z: Cleaning up.
    Sep 19, 2020 6:47:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T06:47:11.862Z: Stopping worker pool...
    Sep 19, 2020 6:47:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T06:47:57.125Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 19, 2020 6:47:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T06:47:57.170Z: Worker pool stopped.
    Sep 19, 2020 6:48:05 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-18_23_45_39-345060914686757828 finished with status DONE.
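
The successful run above is the push-down variant: in its BEAMPlan, BeamPushDownIOSourceRel lists usedFields=[by, type, title, score] and a supported BigQueryFilter, and the "Pushing down the following filter" line confirms that both the projection and the predicate are evaluated inside the DIRECT_READ BigQuery source rather than in a downstream Calc. As a rough sketch of the query shape only (not the test's code, and without the BigQuery table provider that enables the actual push-down), the same SELECT/WHERE can be run with Beam SQL's SqlTransform against any schema-aware PCollection. The class name, schema, and sample rows below are assumptions, and beam-sdks-java-extensions-sql must be on the classpath:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class PushDownQueryShapeSketch {
      public static void main(String[] args) {
        Pipeline pipeline = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Hypothetical stand-in for the HACKER_NEWS columns the query touches.
        Schema schema = Schema.builder()
            .addNullableField("by", Schema.FieldType.STRING)
            .addNullableField("type", Schema.FieldType.STRING)
            .addNullableField("title", Schema.FieldType.STRING)
            .addNullableField("score", Schema.FieldType.INT64)
            .build();

        PCollection<Row> hackerNews = pipeline.apply(
            Create.of(
                    Row.withSchema(schema).addValues("someone", "story", "A title", 3L).build(),
                    Row.withSchema(schema).addValues("someone", "comment", "n/a", 10L).build())
                .withRowSchema(schema));

        // A single input PCollection is visible to SqlTransform under the name
        // PCOLLECTION. This reproduces only the query shape from the log; the
        // projection/filter push-down itself happens in the BigQuery table
        // provider used by the integration test, not here.
        PCollection<Row> result = hackerNews.apply(
            SqlTransform.query(
                "SELECT `by` AS author, `type`, `title`, `score` "
                    + "FROM PCOLLECTION "
                    + "WHERE (`type` = 'story' OR `type` = 'job') AND `score` > 2"));

        pipeline.run().waitUntilFinish();
      }
    }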

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): be47802c-f459-43f4-b43b-8f49178918b1 and timestamp: 2020-09-19T06:48:05.415000000Z:
                     Metric:                    Value:
                   read_time                    11.744
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 19, 2020 6:48:05 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 2 mins 42.104 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 49s
107 actionable tasks: 65 executed, 42 from cache

Publishing build scan...
https://gradle.com/s/xr3e3qlpolfs2

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1012

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1012/display/redirect?page=changes>

Changes:

[daniel.o.programmer] Update Beam website to release 2.24.0.

[daniel.o.programmer] Update date for Beam release.

[daniel.o.programmer] Update date again.

[noreply] Add a blog post for Apache Beam 2.24.0. (#12745)

[noreply] [BEAM-10894] Basic CSV reading and writing. (#12841)

[noreply] [BEAM-7372] Remove Python 2 testing. (#12872)


------------------------------------------
[...truncated 274.68 KB...]
    Sep 19, 2020 12:45:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 19, 2020 12:45:35 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 19, 2020 12:45:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 19, 2020 12:45:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 19, 2020 12:45:35 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 19, 2020 12:45:35 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 19, 2020 12:45:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 19, 2020 12:45:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 19, 2020 12:45:35 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 19, 2020 12:45:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 19, 2020 12:45:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 19, 2020 12:45:35 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 19, 2020 12:45:35 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 19, 2020 12:45:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 19, 2020 12:45:36 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 19, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 19, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-2Yi9CGTnn7Y_VUbe5ujHIJNoBLfXLlpZwXBrB3dIU2c.jar
    Sep 19, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-zhg1i-JbQ2GW3c4amKLsNaEh9ICmntlwKaFPjWlkg54.jar
    Sep 19, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-4lLaBNq1x7zWEh2SjZ5iQ7cSQKDVULBW7TJqdcwyJbg.jar
    Sep 19, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-EgtiGvh6ISwIyGku0q5SFs5kx4fckFQtKIjV8f0cTWc.jar
    Sep 19, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-Vt4wwtIy9OtDHo9R808uNGEEkOE4hfePG4P-KiyhM9M.jar
    Sep 19, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-DmrvdraqL-kgPe3xUWg7BZZS_xyWWLMOVVfQzTegSms.jar
    Sep 19, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-phtYkjMdY2wcwoaXCPlpVo7HbHSoHVTcP0AWOEBK9L0.jar
    Sep 19, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-ygfNfKcijRZdgYAIFPFO33IltmQTpHkD3nCjEv2tNas.jar
    Sep 19, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3962022023690903602.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-2EgzuFW5ynn0nKAAOpBZmc4zKbZ0CigOa-dy6qnvPhU.jar
    Sep 19, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-Lf9STY8Xb8z4N223vaJFgKjEIcWpEqVtW1h4blBGfMw.jar
    Sep 19, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-PLYCCHappCf15JBpjkkh_zuALFvzca67FZtoUcVPAy4.jar
    Sep 19, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-17x6j8Ehj5tMrg5cV7LlAaxJTJm78K-c3UDtHYnTCG4.jar
    Sep 19, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-oPT_GgzzBxyheMW6RyiNP63QxsE7UKTi47C7z1jNRHA.jar
    Sep 19, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-kCIOZdDacXIvc-V2ywNCnpWYvPHCG3Q_vfhwZyTfQes.jar
    Sep 19, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-K4J-zZ58Pu9ohIINxbEX8Mk2PvKBeCQsgW34Is74Uso.jar
    Sep 19, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-gZ6OtLHkOOXXW3WA8kVFT7lwodNHdqy-2BZMIOCr6ns.jar
    Sep 19, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-fGBtxld6nsjaWz7GIsjq0kIXB28_c4D9l-N_qRCW63I.jar
    Sep 19, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-2Yi9CGTnn7Y_VUbe5ujHIJNoBLfXLlpZwXBrB3dIU2c.jar
    Sep 19, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-SVgiVbjLdWYB7oAF9IKx5RPAibeaKeOTHXoMfmN4xY0.jar
    Sep 19, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-p186Exh2vfrrSjPEA-BfGzUyDGnBd6zi3QWhQdILGfk.jar
    Sep 19, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-fVWzQh-z9vCa6AzULN4AtB1yLQBHtreStpDLYleeKOI.jar
    Sep 19, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-M2SUQE2FAJYDHlpm-dP9z3zfk-LFLigOyFFSRDrkENM.jar
    Sep 19, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-udd4eno328metyn7efY_Kfxh8iSV7fL4JX-6RMLqtTE.jar
    Sep 19, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-Ze7ptGbDrRkkVIF-uIoAUei-mBmTfFPIF2UDmfFy8Ds.jar
    Sep 19, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-rjXA9jrtIqCCIOkx4r-7uc2RUf9wsOfFewiGsWJOYFQ.jar
    Sep 19, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-wmbVXMt8W0ebvXlQp-uQgcc2DXnrN41xt-QO8fefNTM.jar
    Sep 19, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-VFRvQJdGG6vbxVCXINpMLolDBh9keJ3YENJcbVuK85U.jar
    Sep 19, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-7bSPUjllF5n2gHKnI6ADsGbhvpGHcqw5Bp4cQZ9ORA0.jar
    Sep 19, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-UUlRbYr6AtXw7HCe8geMau2TnmGesm9AfaPbtsMvTFc.jar
    Sep 19, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-7ec1aqVm7Y1m5LIAyivQ01102hJp8AI5e2oMTl7OGxE.jar
    Sep 19, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-9JDSEEISfD3U-RYLxQVez-9-4hDsMqYdltogCFmbER0.jar
    Sep 19, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 1 second
    Sep 19, 2020 12:45:40 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 19, 2020 12:45:40 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 19, 2020 12:45:40 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 19, 2020 12:45:40 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 19, 2020 12:45:40 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 19, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95449 bytes, hash 9737a2face4a02ccb4d213d7d177e37c96dcfb17820c59430e74bd9d3bc44dbc> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-lzei-s5KAsy00hPX0XfjfJbc-xeCDFlDDnS9nTvETbw.pb
    Sep 19, 2020 12:45:40 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 19, 2020 12:45:42 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-18_17_45_40-1219052452420873710?project=apache-beam-testing
    Sep 19, 2020 12:45:42 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-18_17_45_40-1219052452420873710
    Sep 19, 2020 12:45:42 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-18_17_45_40-1219052452420873710
    Sep 19, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-19T00:45:40.797Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 19, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T00:45:50.369Z: Worker configuration: n1-standard-1 in us-central1-a.
    Sep 19, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T00:45:51.083Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 19, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T00:45:51.317Z: Expanding GroupByKey operations into optimizable parts.
    Sep 19, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T00:45:51.359Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 19, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T00:45:51.432Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 19, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T00:45:51.466Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 19, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T00:45:51.494Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 19, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T00:45:51.531Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 19, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T00:45:52.379Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 19, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T00:45:52.452Z: Starting 5 workers in us-central1-a...
    Sep 19, 2020 12:46:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T00:46:18.527Z: Autoscaling: Raised the number of workers to 3 based on the rate of progress in the currently running stage(s).
    Sep 19, 2020 12:46:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T00:46:18.572Z: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
    Sep 19, 2020 12:46:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T00:46:23.862Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Sep 19, 2020 12:46:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T00:46:23.901Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Sep 19, 2020 12:46:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T00:46:24.385Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 19, 2020 12:46:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T00:46:44.669Z: Workers have started successfully.
    Sep 19, 2020 12:46:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T00:46:44.724Z: Workers have started successfully.
    Sep 19, 2020 12:47:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T00:47:16.545Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 19, 2020 12:47:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T00:47:16.727Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 19, 2020 12:47:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T00:47:16.901Z: Cleaning up.
    Sep 19, 2020 12:47:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T00:47:16.995Z: Stopping worker pool...
    Sep 19, 2020 12:50:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T00:50:05.534Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 19, 2020 12:50:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T00:50:05.582Z: Worker pool stopped.
    Sep 19, 2020 12:50:14 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-18_17_45_40-1219052452420873710 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): d63c45d2-48b9-48c9-b38e-d5a2a62773ab and timestamp: 2020-09-19T00:50:14.312000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    11.681

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 19, 2020 12:50:14 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 4 mins 46.939 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 58s
107 actionable tasks: 64 executed, 43 from cache

Publishing build scan...
https://gradle.com/s/xrvlh5rpmohte

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1011

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1011/display/redirect>

Changes:


------------------------------------------
[...truncated 275.36 KB...]
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 18, 2020 6:45:40 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 18, 2020 6:45:40 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 18, 2020 6:45:40 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 18, 2020 6:45:40 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 18, 2020 6:45:40 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 18, 2020 6:45:40 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 18, 2020 6:45:40 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
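
The failure above names its own remedy: give the ParDo's Row output an explicit schema (or a coder). A minimal sketch of that fix is shown below, assuming a hypothetical two-field schema rather than the test's actual HACKER_NEWS schema; it is an illustration of the pattern, not the integration test's own code.

    // Sketch only: a pass-through ParDo over Rows, with the output schema set
    // explicitly so coder inference does not fail as it does in the log above.
    // The schema, field names, and values here are hypothetical.
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();
        Schema schema =
            Schema.builder().addStringField("author").addInt32Field("score").build();

        // Source of Rows; withRowSchema tells Create how to encode them.
        PCollection<Row> rows =
            p.apply(Create.of(Row.withSchema(schema).addValues("alice", 3).build())
                .withRowSchema(schema));

        // Stand-in for ParDo(RowMonitor): without the setRowSchema call on the
        // output, the planner cannot infer a coder for the Row output.
        PCollection<Row> monitored =
            rows.apply(ParDo.of(
                    new DoFn<Row, Row>() {
                      @ProcessElement
                      public void process(@Element Row row, OutputReceiver<Row> out) {
                        out.output(row);
                      }
                    }))
                .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }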

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 18, 2020 6:45:40 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 18, 2020 6:45:40 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 18, 2020 6:45:40 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 18, 2020 6:45:40 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 18, 2020 6:45:40 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 18, 2020 6:45:40 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 18, 2020 6:45:40 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 18, 2020 6:45:40 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
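
For reference, the plan and filter above amount to a BigQuery Storage read with the used fields projected and the supported predicate applied as a row restriction. The sketch below is a rough, hypothetical equivalent at the BigQueryIO level, not what the SQL layer generates verbatim; the table spec and runner setup are assumptions.

    // Sketch only: DIRECT_READ with selected fields and a row restriction,
    // mirroring usedFields and the pushed-down filter from the plan above.
    import java.util.Arrays;
    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;
    import org.apache.beam.sdk.values.PCollection;

    public class PushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();
        PCollection<TableRow> rows =
            p.apply(
                BigQueryIO.readTableRows()
                    .from("bigquery-public-data:hacker_news.full") // hypothetical table spec
                    .withMethod(TypedRead.Method.DIRECT_READ)
                    .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                    .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));
        p.run().waitUntilFinish();
      }
    }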
    Sep 18, 2020 6:45:41 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 18, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 18, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-QQFd0XqbC9Ls5yES1miCzqv_3C0zwbe7E5NU9bNCiTI.jar
    Sep 18, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-ZfFy62PgFZG-ztZ4Vzk3foXcssa7hXwVYh0ph3rfMys.jar
    Sep 18, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-4KnOCdJu7qrCnFkEHTsuBYDQkF9ZNA6LWi7psdTUJ94.jar
    Sep 18, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-bTFrHSEaDnHnIxdlRjHhitbBNqQ5nFOnbsM6gd7SzWc.jar
    Sep 18, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-xIK_KsMzSAlcvFQv8cN19f4VwawAZhLhD3tMh6RBaRw.jar
    Sep 18, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-QQFd0XqbC9Ls5yES1miCzqv_3C0zwbe7E5NU9bNCiTI.jar
    Sep 18, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-YA-aN3bMfpmqzpYXH1fU7XWP8q5UJ1QfNJtJNlMMwag.jar
    Sep 18, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-iPsJgme7LTMcriBOqdsrVu3L4Eej1ukG3P6oDiW9FoM.jar
    Sep 18, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4249448483059047495.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-qUDM2m6Y9rDRR5SQcWDwAJ1L8KnTBguYzXkz_baxsP0.jar
    Sep 18, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-jvvXH_LHbRPVFMrQ4VOcSU5lXZwOAZ1JG5IarSh2mo8.jar
    Sep 18, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-riVQB_itpOaW57K2VtN4vwXEqhHamoV4hNXK6WAo9Xo.jar
    Sep 18, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-2Gz27CKWhGGEguqLMe6TTszVYh03CCarFi_Aw6qHKFA.jar
    Sep 18, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-VGZZiEJsU2qCJ2OUwN_teyd8duITC9ZFMfx8y6d27F0.jar
    Sep 18, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-iY-ABmw_FbNGE3vQRbueokl6ubGitcRHYn2l9_JDpLA.jar
    Sep 18, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-CDnYvU8vHgdil7e-vFXHhPgvro2wqcA7Q2Drgu8tnlE.jar
    Sep 18, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-RJIXvvjvTOGVwsc3YaaVYldm1q-cAYy2m36WiWE9ISA.jar
    Sep 18, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-uzrIsR62vMJU-6H6ipgEnLWNExQXBq2DPwK0WOnsKEo.jar
    Sep 18, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-S872Qhlc-BykjXIu9XYMUD9ic-r29Mrb20GZNg_Fmw0.jar
    Sep 18, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-WjZ9n4Xm1HrLEsyFvcKKmBW0Gg4HCOEeMoc526Yg2qE.jar
    Sep 18, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-MMRQoq23Cb3_-yodE3Gt4SneLUXWe0iQnQKrx2KUbMY.jar
    Sep 18, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-Z3ccmo8xsdFLugS1cCBs1-inSC9PDTmGqAgIetZNwHo.jar
    Sep 18, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-DoargQ1S7PNijbwuKQUn2V8xh2KkKZJuSP3HFtoTOKU.jar
    Sep 18, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-uZyYFHQs2SdnEIOrwjOVbCNWpnjUeOLYJ6p_TWFNoW0.jar
    Sep 18, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-ojfwDcTVpVt5ShdvxqO8XJRy6SChMN_TsghwfOcaVqQ.jar
    Sep 18, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-5EiWONc7kIYH8-Az3JlMwUQYk8a734Pwbj9ShjaIC2w.jar
    Sep 18, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-UZnMcpNvXV1KOUomDeTRfr0F4LsqqHHAJ_xZgDdOB40.jar
    Sep 18, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-cqnuw5M_kSjaN2rX9WkEWY8pwl62m-_xI5Kb6Oz4eE0.jar
    Sep 18, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 193 files cached, 26 files newly uploaded in 0 seconds
    Sep 18, 2020 6:45:45 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 18, 2020 6:45:45 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 18, 2020 6:45:45 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 18, 2020 6:45:45 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 18, 2020 6:45:45 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 18, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95445 bytes, hash eeb9e3a9200f1004be402737dc429a3332e38aa447ebbacac5bb7ad47e9018a2> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-7rnjqSAPEAS-QCc33EKaMzLjiqRH67rKxbt61H6QGKI.pb
    Sep 18, 2020 6:45:45 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 18, 2020 6:45:47 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-18_11_45_45-13064140472509915854?project=apache-beam-testing
    Sep 18, 2020 6:45:47 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-18_11_45_45-13064140472509915854
    Sep 18, 2020 6:45:47 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-18_11_45_45-13064140472509915854
    Sep 18, 2020 6:45:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-18T18:45:45.953Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 18, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T18:45:53.877Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 18, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T18:45:54.733Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 18, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T18:45:54.774Z: Expanding GroupByKey operations into optimizable parts.
    Sep 18, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T18:45:54.816Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 18, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T18:45:54.881Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 18, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T18:45:54.908Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 18, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T18:45:54.943Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 18, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T18:45:54.978Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 18, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T18:45:55.312Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 18, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T18:45:55.395Z: Starting 5 workers in us-central1-b...
    Sep 18, 2020 6:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T18:46:05.402Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 18, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T18:46:23.630Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 18, 2020 6:46:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T18:46:43.419Z: Workers have started successfully.
    Sep 18, 2020 6:46:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T18:46:43.469Z: Workers have started successfully.
    Sep 18, 2020 6:47:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T18:47:14.203Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 18, 2020 6:47:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T18:47:14.361Z: Cleaning up.
    Sep 18, 2020 6:47:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T18:47:14.445Z: Stopping worker pool...
    Sep 18, 2020 6:48:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T18:48:06.340Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 18, 2020 6:48:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T18:48:06.594Z: Worker pool stopped.
    Sep 18, 2020 6:48:23 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-18_11_45_45-13064140472509915854 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 9ed7cc16-ff7e-448a-ab11-5b3577d519e6 and timestamp: 2020-09-18T18:48:23.976000000Z:
                     Metric:                    Value:
                   read_time                    13.335
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 18, 2020 6:48:24 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.018 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 2 mins 51.908 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 7s
107 actionable tasks: 65 executed, 42 from cache

Publishing build scan...
https://gradle.com/s/a2qzwhedcpalg

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1010

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1010/display/redirect?page=changes>

Changes:

[noreply] [BEAM-10643] Major rearrangement of Nexmark dashboards (#12863)


------------------------------------------
[...truncated 272.86 KB...]
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_3/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 18, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 18, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 18, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 18, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 18, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 18, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 18, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 18, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 18, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 18, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 18, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 18, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 18, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 18, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 18, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 18, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 18, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 18, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-AMiRSEmMxzNjmeqB_yOF3N0ef3CbZkTF6sBfVq9ygTs.jar
    Sep 18, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-BO_WC9p5L4dZqsJE3lBBlhBwgz4GzDpo-cPQ7V2_MOQ.jar
    Sep 18, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-UErWmYbcaJp4nP6OwjtF0NMAywPsgljm6roqA65x7lQ.jar
    Sep 18, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-bBVPp6w6McbSHbeeWWnhFmftY2qWzHzqskQ-6kDYnJk.jar
    Sep 18, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-Yg2FmC5lyEyHE-4w-kiZy8UaoiB2jT9yougfEOPvOPo.jar
    Sep 18, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-GVxBBlWFCoQJ8ULkXRGh8JVmgAFtQPE0z3PJEkxV2jw.jar
    Sep 18, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-LbIIH8WE67uCdqj0PeaWVifxdUrrUuFtapqxCbafJI8.jar
    Sep 18, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-8pg9ZYUHyFl4aeOaNQaCb4Itg8OJ9DwNVdJ13A0M5eM.jar
    Sep 18, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-XCPrgjUZ4ogxXxmpOi0b8RMhOpj67iBVWecXDwL58dA.jar
    Sep 18, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-8CmpnEKH83MFyZGOUV3rh_tX5nDRrfavJZssCthhFxo.jar
    Sep 18, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-cWpxbUCiZt_LrFwcAktwV0WCTDTyfa1G20YlZc0ZS7k.jar
    Sep 18, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-95RwWBSZgKf4qLG59Kszq-0YGyB9qKLkX2pAg0TNa1s.jar
    Sep 18, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-hYSbfo3pnk077PsdUxaxJOVRgoMkR3-DvXyOvmneNQA.jar
    Sep 18, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-rU2_fCxYvlkfEBIvJkwlSJLQUAEeBIHkcwircVN5htE.jar
    Sep 18, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-hGFgEDPO0wGGL7tlNAwbeViO9Ma-Yw1vibylGBpyZos.jar
    Sep 18, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-LxPqVH6ftX3slPr2A5a5Ffynf47fcDDNtfl9slTGyiI.jar
    Sep 18, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-38xMweGTtH20nU3zRWtkJ4t8k5KXR3VkER7bZEU2xFE.jar
    Sep 18, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-aUveCFTeJuZ1qs_wMcVocbL5VcE6IWL8fMv-AcpXWbU.jar
    Sep 18, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-TjvulBBT76G9tCj920LXnyU7Z4HUsUZUgwfqGMnLN6o.jar
    Sep 18, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-AMiRSEmMxzNjmeqB_yOF3N0ef3CbZkTF6sBfVq9ygTs.jar
    Sep 18, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-uhXPjygq-AbBDI2swP6yyaxN2_Ht_5vdk6mXb7Jv2Wk.jar
    Sep 18, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-kaJYFo4XWcZigzwuvTG3oP3eHTr-ZS2966TsfYzsKKQ.jar
    Sep 18, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-vC4Ham4L3XTD-lnpcIRGloYJ7AvidKmq4mtO-CWyoGk.jar
    Sep 18, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-KM4VFkSQffT8EZgskVZq-TQrmS5ywvlrxasmAYTwvg8.jar
    Sep 18, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test63759738527024413.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-785iSX3wunNCZ36Jl9hgI40frI_HptkbqkFUxP4_BKs.jar
    Sep 18, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-RnJYGv_YbmFO_N3GkXd2nzGUd4nF5uAW1FbfkqL2qS0.jar
    Sep 18, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-CCkW5EPbgAp4Fnhbd_pa817Eld5hGHrlMYbdC92Sibw.jar
    Sep 18, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 193 files cached, 26 files newly uploaded in 0 seconds
    Sep 18, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 18, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 18, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 18, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 18, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 18, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95443 bytes, hash 5eefa4ca53de8963d6348bd097829546a01f277f7d351f84b1f1581414ac10e2> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Xu-kylPeiWPWNIvQl4KVRqAfJ399NR-EsfFYFBSsEOI.pb
    Sep 18, 2020 12:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 18, 2020 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-18_05_45_25-1696692630971199277?project=apache-beam-testing
    Sep 18, 2020 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-18_05_45_25-1696692630971199277
    Sep 18, 2020 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-18_05_45_25-1696692630971199277
    Sep 18, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-18T12:45:25.344Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 18, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T12:45:34.885Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 18, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T12:45:36.101Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 18, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T12:45:36.139Z: Expanding GroupByKey operations into optimizable parts.
    Sep 18, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T12:45:36.168Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 18, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T12:45:36.249Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 18, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T12:45:36.278Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 18, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T12:45:36.312Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 18, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T12:45:36.347Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 18, 2020 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T12:45:36.690Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 18, 2020 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T12:45:36.775Z: Starting 5 workers in us-central1-b...
    Sep 18, 2020 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T12:45:56.153Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 18, 2020 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T12:46:05.373Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 18, 2020 12:46:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T12:46:28.221Z: Workers have started successfully.
    Sep 18, 2020 12:46:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T12:46:28.253Z: Workers have started successfully.
    Sep 18, 2020 12:46:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T12:46:57.000Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 18, 2020 12:46:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T12:46:57.157Z: Cleaning up.
    Sep 18, 2020 12:46:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T12:46:57.220Z: Stopping worker pool...
    Sep 18, 2020 12:47:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T12:47:51.425Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 18, 2020 12:47:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T12:47:51.466Z: Worker pool stopped.
    Sep 18, 2020 12:48:05 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-18_05_45_25-1696692630971199277 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 0dae0184-da99-40e0-a847-e9ea84891107 and timestamp: 2020-09-18T12:48:05.353000000Z:
                     Metric:                    Value:
                   read_time                    12.805
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 18, 2020 12:48:05 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.024 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 2 mins 55.902 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 49s
107 actionable tasks: 62 executed, 45 from cache

Publishing build scan...
https://gradle.com/s/zi2mackkcaumy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1009

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1009/display/redirect>

Changes:


------------------------------------------
[...truncated 271.11 KB...]
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_3/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)
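
The failure above is the generic "no Coder for Row" error: a PCollection<Row> produced by a ParDo has no schema attached, so no RowCoder can be inferred. As the message suggests, the usual fix is to call setRowSchema (or setCoder) on that output. A minimal sketch follows, with an assumed schema and a hypothetical DoFn standing in for the test's RowMonitor.

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    // Assumed projection schema; the real test derives its schema from the BigQuery table.
    Schema rowSchema =
        Schema.builder()
            .addStringField("author")
            .addStringField("type")
            .addStringField("title")
            .addInt64Field("score")
            .build();

    PCollection<Row> monitored =
        input
            .apply("RowMonitor", ParDo.of(new RowMonitorFn(rowSchema)))  // hypothetical DoFn emitting Row
            .setRowSchema(rowSchema);  // attaches the schema so a RowCoder can be inferred downstream
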

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 18, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 18, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 18, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 18, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 18, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 18, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 18, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 18, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 18, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 18, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 18, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 18, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 18, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 18, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 18, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
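
The plan above shows the query's projection and filter being pushed into the BigQuery source. Independent of the table provider, the same query shape can be run with Beam SQL over any schema-aware PCollection<Row>; a minimal sketch is below, where the input collection is assumed and the implicit PCOLLECTION table name stands in for the test's registered HACKER_NEWS table.

    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    // Sketch: the same projection + filter via Beam SQL over an in-memory
    // PCollection<Row> assumed to carry a schema with by/type/title/score fields.
    PCollection<Row> filtered =
        hackerNewsRows.apply(
            SqlTransform.query(
                "SELECT `by` AS author, type, title, score "
                    + "FROM PCOLLECTION "
                    + "WHERE (type = 'story' OR type = 'job') AND score > 2"));
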
    Sep 18, 2020 6:45:15 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 18, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 18, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-S-TcMSkp7XzlimFnooGTslS3MIa8xZ4pqDmYJKUUJvU.jar
    Sep 18, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-Jd0pIhk3KuBZe8d_28SppClPMghB54vZ3aULIOTt_N8.jar
    Sep 18, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-TrQIOw7zmGysyIVderOm3j7s0sFfOHlsuLklY3XpWzM.jar
    Sep 18, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-RpKzYtNFlwUOcZO7klHjUMV7LhUlVqXqbvhGf23aX8A.jar
    Sep 18, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-xoxmgZmuCNVXq-n9D5u6r9285_CroDNys4CznogZPnU.jar
    Sep 18, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-Z4fJZwg2qQT0848AEmJKqVJnkomdIHmFAc8lNquLhBE.jar
    Sep 18, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-272pZrh7Vp7DHnucE2n_mqa4cJ5kV7N6IVijLxSyyuE.jar
    Sep 18, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-uySmmi4tT1GwgLuZMxxt2-zBSJ9ZB6INzhG8mU9_2R4.jar
    Sep 18, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-ScoYPYTJnIpPmc9s5WeWLX8EU4INbHZRQlHXbOHSj1M.jar
    Sep 18, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-jj9ySUXlTLWjCs__CWk7ZcGcp4VtotFiweXRagDgrzI.jar
    Sep 18, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-jvTk_yRge5kbR2IX-4W-tOmCcbjgnQL_Q8REfThNWno.jar
    Sep 18, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-kctoKjiTjxF005nQse64ZOzRo6_D-OzcdA_hXvlQSvc.jar
    Sep 18, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3099444028583100172.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-Ea3JIODxhwJlJo00_Gt1BmGMG4DlsudSG4Afqu1dfy4.jar
    Sep 18, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-YSs2iYaTRYW3SnsD1Crs473fTSgQg3d8vpZ8kbAnqaY.jar
    Sep 18, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-rGtVlYbz5kYgL8FvEeo-pAmiq9YqE3kGWsuiDD0xb64.jar
    Sep 18, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-DZcjUZB62q0ojM0Dg93Kc5-6rVcwWEamihL-nISzqkk.jar
    Sep 18, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-VYq3UEncn4mynf5aP-EjSwgSe8bQa_gfJmnWL_kIRCk.jar
    Sep 18, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-YGmyY5e7jBYy5shZqiTKHE797G8SCd36R1Ms0ajGNEA.jar
    Sep 18, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-YSaob-q2BXf9cccXPWde87n88g3eDsfNdyvGf_q-oCY.jar
    Sep 18, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-V2kjQgGFI9K0gdM5P2F47_ZDtEeEIREluVQAhHJNjA0.jar
    Sep 18, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-5C5D6aFpWFpX9YUvla8rMF8JggnRnYQqUW7pX1pgYOQ.jar
    Sep 18, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-3XHVLosiu8pvHdyWugG2m3YGKnRt0QUbASgt1LHWpn0.jar
    Sep 18, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-Nlo_TMqF24T1-OJhvFAM7HHbEClR6_0I_DJd09RMcKs.jar
    Sep 18, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-bniDQlfKdV_tXhvQpknJ6lUGfzI10G_RKEsuRrhYXIk.jar
    Sep 18, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-CXF_2kus1XurAUTm8W5Hh14C4ozXacVK2Q87hNxA-fE.jar
    Sep 18, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-chRDcGmRqy-Eih70LEW4fiHTwfsZ517DXEoTomXYtAk.jar
    Sep 18, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-S-TcMSkp7XzlimFnooGTslS3MIa8xZ4pqDmYJKUUJvU.jar
    Sep 18, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 193 files cached, 26 files newly uploaded in 1 seconds
    Sep 18, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 18, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 18, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 18, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 18, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 18, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95445 bytes, hash 698d7b1e4a13f0f9668a948e8fc4ed222f5855cce3a2ad961b95891d7f342883> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-aY17HkoT8PlmipSOj8TtIi9YVczjoq2WG5WJHX80KIM.pb
    Sep 18, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 18, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-17_23_45_18-4367868156119383329?project=apache-beam-testing
    Sep 18, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-17_23_45_18-4367868156119383329
    Sep 18, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-17_23_45_18-4367868156119383329
    Sep 18, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-18T06:45:18.888Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 18, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T06:45:27.496Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 18, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T06:45:28.200Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 18, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T06:45:28.245Z: Expanding GroupByKey operations into optimizable parts.
    Sep 18, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T06:45:28.286Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 18, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T06:45:28.355Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 18, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T06:45:28.392Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 18, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T06:45:28.426Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 18, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T06:45:28.460Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 18, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T06:45:28.957Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 18, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T06:45:29.030Z: Starting 5 workers in us-central1-b...
    Sep 18, 2020 6:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T06:45:53.177Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 18, 2020 6:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T06:45:57.136Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 18, 2020 6:46:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T06:46:15.123Z: Workers have started successfully.
    Sep 18, 2020 6:46:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T06:46:15.170Z: Workers have started successfully.
    Sep 18, 2020 6:46:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T06:46:47.178Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 18, 2020 6:46:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T06:46:47.341Z: Cleaning up.
    Sep 18, 2020 6:46:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T06:46:47.426Z: Stopping worker pool...
    Sep 18, 2020 6:47:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T06:47:42.116Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 18, 2020 6:47:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T06:47:42.169Z: Worker pool stopped.
    Sep 18, 2020 6:47:50 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-17_23_45_18-4367868156119383329 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 9dc4001f-64a4-49f5-bec0-f5dc03a38dd6 and timestamp: 2020-09-18T06:47:50.455000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    12.495

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 18, 2020 6:47:50 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 2 mins 44.717 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 33s
107 actionable tasks: 62 executed, 45 from cache

Publishing build scan...
https://gradle.com/s/j6ndljpiaw5t4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1008

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1008/display/redirect?page=changes>

Changes:

[Robert Bradshaw] Avoid re-encoding row types.

[noreply] [BEAM-10921] Fix flaky unit tests on Windows (#12866)

[noreply] [BEAM-9543] Add blog post for MATCH_RECOGNIZE (#12735)

[sychen] Add a step property for shardable states

[noreply] [BEAM-10906] Add basic Select transform. (#12832)


------------------------------------------
[...truncated 272.81 KB...]
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 18, 2020 12:45:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 18, 2020 12:45:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 18, 2020 12:45:36 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 18, 2020 12:45:36 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 18, 2020 12:45:36 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 18, 2020 12:45:36 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 18, 2020 12:45:36 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 18, 2020 12:45:36 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 18, 2020 12:45:36 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 18, 2020 12:45:36 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 18, 2020 12:45:36 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 18, 2020 12:45:36 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 18, 2020 12:45:36 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 18, 2020 12:45:37 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 18, 2020 12:45:37 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 18, 2020 12:45:37 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-aVCC1aTUHkFA9N_9b-jtKfBFaxZ2xLt8kpty6oe1CKM.jar
    Sep 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-8WSQ-2c694nlc_dHVbKQuISY0sNHbZJiGXJeudbUqyU.jar
    Sep 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tctD9XG8HT3ZVV-FhOTuOxhA1QHF8MikEuccBzSchY0.jar
    Sep 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-BkQYnayGtDC5W1aiN2h8Znosyfw36EY0PZ5vQz0Dr7U.jar
    Sep 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-vCoekkR21IscTGa6SOf0nl7rZFtIqrBMg0zI-YBGzpM.jar
    Sep 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5398045425933865601.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-XeN6gw8uJA7OyAr_4RU4hv5P20qK3EtdBhzj8iULs1o.jar
    Sep 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-2aNej5MizvRN-8IFp1GNdEr11onFc5nkBDc9o-u2CZI.jar
    Sep 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-l2PoSIt34MKPEfmP9E228vnF4gXzy6VzgifIZGp6KOw.jar
    Sep 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-wUvQSTqyQBxtbVadbmsq8q6W60ZPj66A6sxJ4qDb6bo.jar
    Sep 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-rMmkTQ6P67n839Ckhz2ynpInp42W4i2z6gxm3917Ysg.jar
    Sep 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-vSExpXudm5LFQgo-TjT-WTAN1t388Py_b1RRNlf2XwY.jar
    Sep 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-sq2I4dKnxZcRsxU3-_HmihU2AW8IN5pQ5brF1Yy-uvM.jar
    Sep 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-WLtoQz9w9SXRAi8oOTPxcT4yFQ0Nbs_WRIk-w_DzBdY.jar
    Sep 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-PKrMEM4IV1aIbQpob7aX3Lnn9riOmpf3niQVL0a2GO4.jar
    Sep 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-dzjp5cBvP8PZlK0Zddocmpphr0vi8VNzA69Q0m4-hiw.jar
    Sep 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-9D9WxCtr_HPNB9gHPtw6PTRKjC_Mjvt1w88728Ky-1w.jar
    Sep 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-iKwFRj9hjbKuZPorvZ8W5iCkNa96uRsx_Vh2PazeH7k.jar
    Sep 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-byzdTYOmwmcqLBtRtgQ1j5Zbx9Ad31RHCmAbXR4n_LQ.jar
    Sep 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-jH6Dmz9tQBLJ1_W_Vke9qzyFHgv9yD5p-ND5Sc9dTmU.jar
    Sep 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-aVCC1aTUHkFA9N_9b-jtKfBFaxZ2xLt8kpty6oe1CKM.jar
    Sep 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-SUACEgwd4F7efaBIzj6lM280u2ZuPAfGqrlg6mAGK20.jar
    Sep 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-octkoyjZC7WPkzzBh0ppnQIad_8da20vY4c_gQx16nY.jar
    Sep 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-dXzY_2wBQwSchD_JGXLUTRKkvMDn6ar6Wjy3uZ_0LA4.jar
    Sep 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-4Ic9U1n845UZPO0YBQhcUWPZ1dpnQSn-jJa0TWVAJeo.jar
    Sep 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-kamIxJ_qZq5P6MgKVsOWibnhthiBn4VEkCpdwiSKdlE.jar
    Sep 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-rdZZbhkLsA8rD-dPfUvZvZGdE6RzGnFboDEXMqIHeH0.jar
    Sep 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-77dX7Jg24U1pckKMYECu1OJiuoHwy8uRqH6Jsr0DY_A.jar
    Sep 18, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 193 files cached, 26 files newly uploaded in 0 seconds
    Sep 18, 2020 12:45:42 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 18, 2020 12:45:42 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 18, 2020 12:45:42 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 18, 2020 12:45:42 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 18, 2020 12:45:42 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 18, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95445 bytes, hash 1ea4bdfbc970e1c18aa63ec3fe287a2e05134ac1cfe8384f94d6114af27423b2> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-HqS9-8lw4cGKpj7D_ih6LgUTSsHP6DhPlNYRSvJ0I7I.pb
    Sep 18, 2020 12:45:42 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 18, 2020 12:45:43 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-17_17_45_42-4193983501382752725?project=apache-beam-testing
    Sep 18, 2020 12:45:43 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-17_17_45_42-4193983501382752725
    Sep 18, 2020 12:45:43 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-17_17_45_42-4193983501382752725
    Sep 18, 2020 12:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-18T00:45:42.654Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 18, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T00:45:50.842Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 18, 2020 12:45:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T00:45:51.407Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 18, 2020 12:45:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T00:45:51.471Z: Expanding GroupByKey operations into optimizable parts.
    Sep 18, 2020 12:45:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T00:45:51.508Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 18, 2020 12:45:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T00:45:51.585Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 18, 2020 12:45:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T00:45:51.614Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 18, 2020 12:45:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T00:45:51.649Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 18, 2020 12:45:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T00:45:51.682Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 18, 2020 12:45:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T00:45:52.016Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 18, 2020 12:45:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T00:45:52.080Z: Starting 5 workers in us-central1-b...
    Sep 18, 2020 12:46:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T00:46:19.490Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 18, 2020 12:46:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T00:46:24.425Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 18, 2020 12:46:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T00:46:40.993Z: Workers have started successfully.
    Sep 18, 2020 12:46:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T00:46:41.025Z: Workers have started successfully.
    Sep 18, 2020 12:47:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T00:47:15.374Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 18, 2020 12:47:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T00:47:15.557Z: Cleaning up.
    Sep 18, 2020 12:47:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T00:47:15.654Z: Stopping worker pool...
    Sep 18, 2020 12:48:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T00:48:13.663Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 18, 2020 12:48:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T00:48:13.693Z: Worker pool stopped.
    Sep 18, 2020 12:48:22 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-17_17_45_42-4193983501382752725 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 8435896c-4c63-4fec-b45f-72b5010abea1 and timestamp: 2020-09-18T00:48:22.697000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    12.995
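
For reference, the push-down reported above (usedFields plus the row-restriction filter) corresponds roughly to a Storage API read expressed directly against BigQueryIO. The sketch below is illustrative only; the table reference is a placeholder rather than the dataset this job actually reads:

    import java.util.Arrays;
    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.values.PCollection;

    class DirectReadPushDownSketch {
      static PCollection<TableRow> read(Pipeline p) {
        return p.apply(
            "Read with push-down",
            BigQueryIO.readTableRows()
                // Placeholder table reference, not the table used by this job.
                .from("my-project:my_dataset.HACKER_NEWS")
                .withMethod(BigQueryIO.TypedRead.Method.DIRECT_READ)
                // Field projection pushed to the Storage API.
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                // Filter pushed to the Storage API as a row restriction.
                .withRowRestriction(
                    "(`type` = 'story' OR `type` = 'job') AND `score` > 2"));
      }
    }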

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 18, 2020 12:48:23 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker,5,main]) completed. Took 2 mins 59.809 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 6s
107 actionable tasks: 65 executed, 42 from cache

Publishing build scan...
https://gradle.com/s/ozl7zw6kdtz2a

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1007

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1007/display/redirect?page=changes>

Changes:

[Robert Bradshaw] [BEAM-10889] Add a note about BatchElements on GroupIntoBatches pages.

[Kyle Weaver] [BEAM-10915] Fix error hint for AVG(INT64).

[aromanenko.dev] [BEAM-10816] Make KinesisClientThrottledException public

[noreply] [BEAM-10620] Eliminate nullability errors from

[noreply] Minor GroupBy doc fixes (#12860)


------------------------------------------
[...truncated 271.10 KB...]
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 17, 2020 6:45:22 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 17, 2020 6:45:22 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 17, 2020 6:45:22 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 17, 2020 6:45:22 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 17, 2020 6:45:22 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 17, 2020 6:45:22 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 17, 2020 6:45:22 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 17, 2020 6:45:23 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 17, 2020 6:45:23 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 17, 2020 6:45:23 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 17, 2020 6:45:23 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 17, 2020 6:45:23 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 17, 2020 6:45:23 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 17, 2020 6:45:23 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 17, 2020 6:45:23 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 17, 2020 6:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 17, 2020 6:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 17, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-qXEpjlCUqavIOGmw4Vw_a2h43e7gSaOjSrRmkaEix8o.jar
    Sep 17, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-MwXGb0dJ-ddfpZ_2N33fFXrPXmcx5-sMoYRPYz73o5A.jar
    Sep 17, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8528050366632878484.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-FsZJ4rfEQ9Mj9OxRb_LscoxTWSi8_X0esgGlr1tBAEo.jar
    Sep 17, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-LMFD5MWW9FgpvRjxtRHrGwVel4m1EzK0TkFwlHx1U-o.jar
    Sep 17, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-JaHR2gf964VCQknonNHHtZooWrEmF8eBAAeJfab3CHE.jar
    Sep 17, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-8LpE63cLrv_bk9gHWfCfGEiFE2d_PXhf3lDk5qvAWnI.jar
    Sep 17, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-bNW4SQnWvnEz5pHQjrZv1o5wlfS0FaCrz4Ov7ZPBwE4.jar
    Sep 17, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-UngQF1ckQ6pSNQRrjeqQIQ7RYEeQN5VSD9p01oqAKec.jar
    Sep 17, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-h-JHJbzySfsIpIZPv6uoidQ4EHvmCz7uyUloJaOcZ4Y.jar
    Sep 17, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-b9Sw4wMoAExZDdZ_s4apKjYX9bbaSMZGuoVhM5XTXB0.jar
    Sep 17, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-mmeqxUta7a6S_tsG676A9pZ-Y2jdC49UBKcXUPVEhKc.jar
    Sep 17, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-CTPxdQ817usY1GEBi71jDQzSPWx6-P4HRfy20kfAy6w.jar
    Sep 17, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-Bz8kPQsMFX10UAeCaKvyJPgE9582UYhNUyHNa7_PcDg.jar
    Sep 17, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-fYz99pwiTpG356ffiDhtYwXpowW9LfO6CPzkidincI4.jar
    Sep 17, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-urCAWLDsV3BkxmtbS7UR6XsctrEYfGmIH6kaG6FKCnQ.jar
    Sep 17, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-MdWBi18iaz_np2iQCPdaGBQQo3OsdI3VSPGhPLiZhdA.jar
    Sep 17, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-AcYAf4uuIOR3HoyjySDGv_CWyslHrCsk35z1Jva7jOw.jar
    Sep 17, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-AqRrzuBBv0KmBO4u7megEFVhzGRSfgdxwggY8F3IMH4.jar
    Sep 17, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-aFCGfbG68LAzdO9iwRQr8cfc_RneqZGXrXuP89Pi-cA.jar
    Sep 17, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-Bz8kPQsMFX10UAeCaKvyJPgE9582UYhNUyHNa7_PcDg.jar
    Sep 17, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-Mds5QPF3xY2g6qKQHo63YXW_xkumiT1MNRkNSBeNG-0.jar
    Sep 17, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-kmJYLx6IGEejc_URH4NLV9MXSGM5lxU5lodANrPAnxM.jar
    Sep 17, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-ueqlmTTesOxNv3UdvZJyEKyBfwpLH5cCwCYGxs7tmRM.jar
    Sep 17, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-Vj-xPRLRtvspxzYISVopJcIppDkzsGXo3aTADK2qDRE.jar
    Sep 17, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-nYjYYWJG9bkJX1mvfpn6ySQKOW8F4mHhgZdanbZxMJI.jar
    Sep 17, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-yLCbXobMYkLNrSok0kAKODshF940QZwkYOn4JlDmdIs.jar
    Sep 17, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-y3AScwVGo6SVURVPxxGSxDiK_w3LiYVtp-e_kFwt5eU.jar
    Sep 17, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 193 files cached, 26 files newly uploaded in 0 seconds
    Sep 17, 2020 6:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 17, 2020 6:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 17, 2020 6:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 17, 2020 6:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 17, 2020 6:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 17, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95445 bytes, hash d21253fd5518a11325f6536911b7d7bb8b83ee6c6e9c7860a0866bfe7855b056> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-0hJT_VUYoRMl9lNpEbfXu4uD7mxunHhgoIZr_nhVsFY.pb
    Sep 17, 2020 6:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 17, 2020 6:45:29 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-17_11_45_28-14493791183980167453?project=apache-beam-testing
    Sep 17, 2020 6:45:29 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-17_11_45_28-14493791183980167453
    Sep 17, 2020 6:45:29 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-17_11_45_28-14493791183980167453
    Sep 17, 2020 6:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-17T18:45:28.432Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 17, 2020 6:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T18:45:36.318Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 17, 2020 6:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T18:45:36.930Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 17, 2020 6:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T18:45:36.958Z: Expanding GroupByKey operations into optimizable parts.
    Sep 17, 2020 6:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T18:45:36.983Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 17, 2020 6:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T18:45:37.044Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 17, 2020 6:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T18:45:37.071Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 17, 2020 6:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T18:45:37.095Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 17, 2020 6:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T18:45:37.120Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 17, 2020 6:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T18:45:37.572Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 17, 2020 6:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T18:45:37.637Z: Starting 5 workers in us-central1-b...
    Sep 17, 2020 6:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T18:45:51.149Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 17, 2020 6:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T18:46:05.063Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 17, 2020 6:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T18:46:29.950Z: Workers have started successfully.
    Sep 17, 2020 6:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T18:46:29.991Z: Workers have started successfully.
    Sep 17, 2020 6:46:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T18:46:59.582Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 17, 2020 6:46:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T18:46:59.711Z: Cleaning up.
    Sep 17, 2020 6:47:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T18:46:59.792Z: Stopping worker pool...
    Sep 17, 2020 6:47:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T18:47:50.779Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 17, 2020 6:47:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T18:47:50.830Z: Worker pool stopped.
    Sep 17, 2020 6:48:00 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-17_11_45_28-14493791183980167453 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 4f781dfa-23e9-4e30-8b5f-cade3f7a14c0 and timestamp: 2020-09-17T18:48:00.296000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    10.685

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 17, 2020 6:48:00 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.022 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 2 mins 45.432 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 43s
107 actionable tasks: 63 executed, 44 from cache

Publishing build scan...
https://gradle.com/s/gjct57odd4cas

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1006

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1006/display/redirect?page=changes>

Changes:

[noreply] [BEAM-10870] Add raw private key param to snowflake cross-language


------------------------------------------
[...truncated 268.62 KB...]
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_3/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)
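
The frames above end in BeamSqlRelUtils.toPCollection, the point where the planned query is turned into a PCollection. Outside this IT, the equivalent user-facing path is SqlTransform applied to a schema-aware input; a rough sketch follows, assuming a PCollection<Row> named hackerNews that already carries a schema (this is not the test's actual wiring):

    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.PCollectionTuple;
    import org.apache.beam.sdk.values.Row;
    import org.apache.beam.sdk.values.TupleTag;

    class SqlQuerySketch {
      static PCollection<Row> filterStories(PCollection<Row> hackerNews) {
        // Register the input under the table name the query refers to.
        return PCollectionTuple.of(new TupleTag<>("HACKER_NEWS"), hackerNews)
            .apply(
                SqlTransform.query(
                    "SELECT `by` AS author, type, title, score "
                        + "FROM HACKER_NEWS "
                        + "WHERE (type = 'story' OR type = 'job') AND score > 2"));
      }
    }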

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 17, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 17, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 17, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 17, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 17, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 17, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 17, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 17, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 17, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 17, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 17, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 17, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 17, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 17, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 17, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 17, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 17, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 17, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-td7wroA-6gpSwkIDGT4XqfckNmh7u3c7hEzU_9Lalv0.jar
    Sep 17, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-Y05vd4q3zGoN9FVYpkrcCEJpCBpM-iz8bNMT2kNUGVI.jar
    Sep 17, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-jLL4cLkqUaPDlylaM1Dyw7QbbgVIQILvuIov-R3vOEA.jar
    Sep 17, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-t8uGJ3rnXBe7EH632j3W40NIArS9ejEGh0qGq4iWCds.jar
    Sep 17, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-9pWTMmlk8sGM9cYcaNuJAcfuuQlZ4Mlm43-8bYnhnpE.jar
    Sep 17, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-jqmS_054vJ5FzJwk9745-ITzpgrrQ3_GqjBpXxvFmbU.jar
    Sep 17, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-eQunNtEiEW01-C0xvX6Y7sw209-q_KyChrV8GS4KctM.jar
    Sep 17, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8069727198249426218.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-qoaYzzIxvkkqwzWWUng4Dx7u2AtEDF91e9IAXMGkklU.jar
    Sep 17, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-wNxgNd3FcStoSC4YHtGvBlrInEZRtX1Fu2HYegJsgQk.jar
    Sep 17, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-LDmbLxRf0x5DocuXxboaqF0uMbYlssks-t4AIncRFDw.jar
    Sep 17, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-vbxTIlqpP8xNluXCNpwiUBNp711SMyav_L-3bCUX-WY.jar
    Sep 17, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-td7wroA-6gpSwkIDGT4XqfckNmh7u3c7hEzU_9Lalv0.jar
    Sep 17, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-zzXC-sCD2xT-SxkU6Dxz19AQxPNWoTdFfIIjj-9vBKQ.jar
    Sep 17, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-6sWyRkHV03Mqy3-CsnsR_sU_iDbE-wE9hh9jwDcUqoY.jar
    Sep 17, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-hc4BIT90m57vMgmVq3F7t8aumFkK4fbF2KVBCKxHTdA.jar
    Sep 17, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-wAE2WZXW9EYmXR_bK4zkdTeGoNVWdLGeO5rBOY7rys0.jar
    Sep 17, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-ln5KDF7ob7FKZmV34Nj4fGXHbwjmIsQ3OYTPaIWH1TE.jar
    Sep 17, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-F9IVOEne_wEFhfoxW5xwqZlxTS5XawhcgHffTZlTexU.jar
    Sep 17, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-pjeqmkIXn8pPR8-MpnZiT41Op5_yxvYoqYSNmSf6PDA.jar
    Sep 17, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-wU7S5U2qxR0A5GeRUGJNbFO0mtY-OMavKp1BY2nfTV4.jar
    Sep 17, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-Yd4udaAlE2ubO8kFfTuSwMLQDvaROPM8iq024EeWVL4.jar
    Sep 17, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-nAYpPQ7R68GhMd9hlVSCbtO1N0avva-5OoE7Fj4-c78.jar
    Sep 17, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-royQyvJsCqmvcusF9ryKD9WigCm1B3azDgN-YdOz3sg.jar
    Sep 17, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-Ex7-IqHXD_3BMED3E3u1Geh9PhX_x-uANM7w2-FvMWs.jar
    Sep 17, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-p6pnqsueQShMX15Kqr9IvRua_BulQyUmWYAxoYNnoM4.jar
    Sep 17, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-xkJT0phqsjM1PoOT9gaPxq2iceqhR1Qfz_KXfh2Zsps.jar
    Sep 17, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-XrWb5_1NsbKRJAOnfYjE_kfAC27u-fAtdmtqbOOs3-o.jar
    Sep 17, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 193 files cached, 26 files newly uploaded in 1 seconds
    Sep 17, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 17, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 17, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 17, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 17, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 17, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95445 bytes, hash 2f94c181363d75b388aa9c6fc7ef5948a12789fe5e23bcc01baa545f8d0ba66b> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-L5TBgTY9dbOIqpxvx-9ZSKEnif5eI7zAG6pUX40Lpms.pb
    Sep 17, 2020 12:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 17, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-17_05_45_23-4785705186495463209?project=apache-beam-testing
    Sep 17, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-17_05_45_23-4785705186495463209
    Sep 17, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-17_05_45_23-4785705186495463209
    Sep 17, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-17T12:45:23.162Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 17, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T12:45:31.308Z: Worker configuration: n1-standard-1 in us-central1-f.
    Sep 17, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T12:45:31.943Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 17, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T12:45:31.984Z: Expanding GroupByKey operations into optimizable parts.
    Sep 17, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T12:45:32.008Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 17, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T12:45:32.113Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 17, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T12:45:32.196Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 17, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T12:45:32.283Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 17, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T12:45:32.352Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 17, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T12:45:32.938Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 17, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T12:45:33.025Z: Starting 5 workers in us-central1-f...
    Sep 17, 2020 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T12:45:53.253Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 17, 2020 12:46:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T12:46:00.868Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 17, 2020 12:46:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T12:46:24.601Z: Workers have started successfully.
    Sep 17, 2020 12:46:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T12:46:24.630Z: Workers have started successfully.
    Sep 17, 2020 12:46:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T12:46:55.598Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 17, 2020 12:46:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T12:46:55.806Z: Cleaning up.
    Sep 17, 2020 12:46:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T12:46:55.898Z: Stopping worker pool...
    Sep 17, 2020 12:47:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T12:47:38.965Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 17, 2020 12:47:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T12:47:39.027Z: Worker pool stopped.
    Sep 17, 2020 12:47:47 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-17_05_45_23-4785705186495463209 finished with status DONE.
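
For context on the plan logged above: with DIRECT_READ, the planner pushed both the projection (usedFields=[by, type, title, score]) and the filter into the BigQuery Storage read, so only four columns reach ParDo(RowMonitor). Outside Beam SQL, roughly the same read can be expressed directly on BigQueryIO; the sketch below is illustrative only (the table reference and runner setup are assumptions, not this test's configuration).

    import com.google.api.services.bigquery.model.TableRow;
    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.values.PCollection;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Read only the columns the query needs and let the Storage API apply the filter,
        // mirroring the usedFields/BigQueryFilter shown in the BEAMPlan above.
        PCollection<TableRow> rows =
            p.apply(
                BigQueryIO.readTableRows()
                    .from("bigquery-public-data:hacker_news.full")  // hypothetical table reference
                    .withMethod(TypedRead.Method.DIRECT_READ)
                    .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                    .withRowRestriction(
                        "(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }

Both withSelectedFields and withRowRestriction apply only when the read method is DIRECT_READ.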

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): bf9c2f37-8e61-41c2-8f19-ee1f502a082f and timestamp: 2020-09-17T12:47:47.743000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    12.248

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 17, 2020 12:47:48 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':',5,main]) completed. Took 2 mins 38.314 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 31s
106 actionable tasks: 61 executed, 45 from cache

Publishing build scan...
https://gradle.com/s/b5aikmhr2en2e

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1005

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1005/display/redirect?page=changes>

Changes:

[Robin Qiu] Support UNNEST of a (possibly nested) array field of a struct column


------------------------------------------
[...truncated 272.86 KB...]
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 17, 2020 6:45:43 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 17, 2020 6:45:43 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 17, 2020 6:45:43 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 17, 2020 6:45:43 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 17, 2020 6:45:43 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 17, 2020 6:45:43 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 17, 2020 6:45:43 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 17, 2020 6:45:44 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 17, 2020 6:45:44 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 17, 2020 6:45:44 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 17, 2020 6:45:44 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 17, 2020 6:45:44 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 17, 2020 6:45:44 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 17, 2020 6:45:44 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 17, 2020 6:45:44 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 17, 2020 6:45:45 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 17, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 17, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-_RM7kOkRfEdXWBeUUb3im2H0Rdb4HxqU6-AJK2Tq3aE.jar
    Sep 17, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-XYwLPrd9mePoib2WOLYNZO2uFerPfyPUBuAtd8LRycw.jar
    Sep 17, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-4VZE0kUb91z3TGp_Q7ack0ksRExoYDZfFxxzZuigQrM.jar
    Sep 17, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-ExJAD_lqmGhMVzGh-Rqwr6XWHIsXa99zNbdi-9bH8E8.jar
    Sep 17, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-_DULmanrRQWUXj_cKyxnTxxXJKI0YYDUMx8mGhw0vtU.jar
    Sep 17, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-AsdNMJo9mjaJvW3i4Spzjx3yf9bQrUMgQSjDiLEpqr8.jar
    Sep 17, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-MGm-W3ZGSKlVUnz9FmF60sYqDvs0TgYKMUmLb3Qt_Gc.jar
    Sep 17, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-5W6OvqVjNGwAwSJs00GZ_eahnPkxd3VCY3iRzDz_YCw.jar
    Sep 17, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-_o1UfVgkShY3zgnLQ8EsfpHFhDz_KLd7tltsS5JMB-g.jar
    Sep 17, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-CN_dHMRaBTvQVYrDrmFELjEkeS-OdyIub7nWtXLQIlw.jar
    Sep 17, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-gJ6xMxFjwguN3ILDmIb3n86hQdWkXpXMNm6uxVWOekE.jar
    Sep 17, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-_RM7kOkRfEdXWBeUUb3im2H0Rdb4HxqU6-AJK2Tq3aE.jar
    Sep 17, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-vw00BcJbMllBP_jxvxk1Jbvv_EkD3zg04bNKS7sJ9y0.jar
    Sep 17, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-dkOM8r9gB5ASDujXPVTNFT7NRz4obUR1Y6De52zh0sI.jar
    Sep 17, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-P8WBhQwNbug-LAF2lwaoB6Q6KKzAlOO-TEyMoJOIKwc.jar
    Sep 17, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2194341051306486805.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-YmIP4KUF1gdkDuK8kdG7B7KJzNtBkokpTmXTqdBaElw.jar
    Sep 17, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-FDcexH2fJ0DoA-wTUkBjiSrnj4skj-ftqt-hP_zwcz4.jar
    Sep 17, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-NZrmXY0biEQClW9QofjhlynI2MK6ByHKcGBh6pFxxoI.jar
    Sep 17, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-W34W2XyBzpCahbEZF48u4XhBTn2SwEGtxyChL5xKJew.jar
    Sep 17, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-wvbkFm4GFlWR6AET7m8fQHm-8-63iSSdV9P95IAZnlE.jar
    Sep 17, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-9_Zt3HnQEIJoJc7eBfvuhVrYqj5XjJbFiKff7U-hW6E.jar
    Sep 17, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-IudMEVamMlsfuBwd4uWWW_tp8P2PwTxafhrUPBbKJQ8.jar
    Sep 17, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-hyrL-jfO3UHrut2VVqB3iScdMY8GNrVBb48hMbb6soo.jar
    Sep 17, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-Zz1oM3z7cDhY7q3BL0ZhQRkmh8g7Ei7XdQ5Y4HYq7GM.jar
    Sep 17, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-ToQdRuf6kNofHEnIJHUwBjPTS8KWygWW-ZSqsGfWCoI.jar
    Sep 17, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-OOe7PLlvDJvu3OlT9BAQFIJvO-0FD1lB2RiU-QB0fsU.jar
    Sep 17, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-0nEd9DBG5LDDvh-_fdzH9X_QKZNmVTu7r_oDuQ3xKKg.jar
    Sep 17, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 193 files cached, 26 files newly uploaded in 0 seconds
    Sep 17, 2020 6:45:49 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 17, 2020 6:45:49 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 17, 2020 6:45:49 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 17, 2020 6:45:49 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 17, 2020 6:45:49 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 17, 2020 6:45:49 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95442 bytes, hash 420fc2311209167709222783b9ecd12167982b7eae3ece26ec6e1e631eddeb48> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Qg_CMRIJFncJIieDuezRIWeYK36uPs4m7G4eYx7d60g.pb
    Sep 17, 2020 6:45:49 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 17, 2020 6:45:51 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-16_23_45_49-7910135258459698124?project=apache-beam-testing
    Sep 17, 2020 6:45:51 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-16_23_45_49-7910135258459698124
    Sep 17, 2020 6:45:51 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-16_23_45_49-7910135258459698124
    Sep 17, 2020 6:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-17T06:45:49.664Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 17, 2020 6:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T06:45:57.209Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 17, 2020 6:46:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T06:45:58.140Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 17, 2020 6:46:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T06:45:58.172Z: Expanding GroupByKey operations into optimizable parts.
    Sep 17, 2020 6:46:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T06:45:58.191Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 17, 2020 6:46:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T06:45:58.259Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 17, 2020 6:46:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T06:45:58.311Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 17, 2020 6:46:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T06:45:58.346Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 17, 2020 6:46:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T06:45:58.386Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 17, 2020 6:46:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T06:45:58.855Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 17, 2020 6:46:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T06:45:58.934Z: Starting 5 workers in us-central1-b...
    Sep 17, 2020 6:46:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T06:46:31.425Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 17, 2020 6:46:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T06:46:31.761Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 17, 2020 6:46:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T06:46:50.487Z: Workers have started successfully.
    Sep 17, 2020 6:46:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T06:46:50.521Z: Workers have started successfully.
    Sep 17, 2020 6:47:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T06:47:25.423Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 17, 2020 6:47:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T06:47:25.690Z: Cleaning up.
    Sep 17, 2020 6:47:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T06:47:25.754Z: Stopping worker pool...
    Sep 17, 2020 6:48:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T06:48:11.550Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 17, 2020 6:48:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T06:48:11.595Z: Worker pool stopped.
    Sep 17, 2020 6:48:19 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-16_23_45_49-7910135258459698124 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 40f482cf-146f-451c-a72a-0f31fd8c9273 and timestamp: 2020-09-17T06:48:19.427000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     16.86

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 17, 2020 6:48:19 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.019 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 2,5,main]) completed. Took 2 mins 44.304 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 1s
106 actionable tasks: 64 executed, 42 from cache

Publishing build scan...
https://gradle.com/s/xb6ex6ioqdleq

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1004

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1004/display/redirect?page=changes>

Changes:

[noreply] [BEAM-7009] Add Go SDK Standard Coders yaml tests. (#12588)

[noreply] [BEAM-10907] Revert "Deprecate obsolete CombineFn.add_inputs. (#12802)"


------------------------------------------
[...truncated 274.77 KB...]
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 17, 2020 12:48:54 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 17, 2020 12:48:54 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 17, 2020 12:48:54 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 17, 2020 12:48:54 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 17, 2020 12:48:54 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 17, 2020 12:48:54 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 17, 2020 12:48:55 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
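
The failure above is the coder check in PCollection.finishSpecifying: a PCollection of Beam Rows reaches pipeline construction without a schema or an explicit coder. As the message itself says, either PCollection.setRowSchema or setCoder(RowCoder.of(schema)) clears it. A minimal sketch of the first remedy, using a hypothetical schema that stands in for the four projected columns (illustrative only, not taken from the test code):

    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline pipeline = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Hypothetical schema mirroring the projected columns: author, type, title, score.
        Schema schema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();

        Row row = Row.withSchema(schema).addValues("someone", "story", "a title", 3L).build();

        // Attaching the schema (or an explicit RowCoder) satisfies the check that fails
        // above; without it, coder inference cannot handle a raw Beam Row.
        PCollection<Row> rows =
            pipeline
                .apply(Create.of(Arrays.asList(row)).withCoder(RowCoder.of(schema)))
                .setRowSchema(schema);

        pipeline.run().waitUntilFinish();
      }
    }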

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 17, 2020 12:48:55 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 17, 2020 12:48:55 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 17, 2020 12:48:55 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 17, 2020 12:48:55 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 17, 2020 12:48:55 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 17, 2020 12:48:55 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 17, 2020 12:48:56 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 17, 2020 12:48:56 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
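
For the DIRECT_READ path the planner keeps only the four used fields and hands the whole predicate to the source, which is why the BeamCalcRel above is a bare projection. Outside of Beam SQL, roughly the same read can be written against BigQueryIO's Storage API support; the sketch below assumes an illustrative table spec (the actual table is not named in this log):

    import com.google.api.services.bigquery.model.TableRow;
    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.values.PCollection;

    public class DirectReadSketch {
      public static void main(String[] args) {
        Pipeline pipeline = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Hypothetical table spec; the selected fields and row restriction mirror the
        // projection and filter pushed down in the plan above.
        PCollection<TableRow> rows =
            pipeline.apply(
                BigQueryIO.readTableRows()
                    .from("some-project:some_dataset.hacker_news")
                    .withMethod(Method.DIRECT_READ)
                    .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                    .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

        pipeline.run().waitUntilFinish();
      }
    }
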
    Sep 17, 2020 12:48:58 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 17, 2020 12:49:05 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 17, 2020 12:49:05 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-dwCDT-LaeUvr1MTyxitei1cixow3WVIXTk4bE0iTNsk.jar
    Sep 17, 2020 12:49:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-QR1RCOtkHc1JIobo-Gl4Sjfs6uWLBmSH_m0XPvK4mvo.jar
    Sep 17, 2020 12:49:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-pHb_Ep2jtDuPVSNtMq_8q-Utf4VUrKGqU8b7iFcLTUQ.jar
    Sep 17, 2020 12:49:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-B7LY9BwObfJDzSE_APz2GfritLd_3ffBZ1cp2bd3KHw.jar
    Sep 17, 2020 12:49:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-JE_ebhMcnISAP2M4tdhncVMF2CPjzOzLYo1SZqlKKm8.jar
    Sep 17, 2020 12:49:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-63bujuWzyqaoqen8ZWNgvnANJXvWJ7eXCGADm70DjNw.jar
    Sep 17, 2020 12:49:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-I9ldQchvKXB64yhDpPCMf2rdtmEVchUTRhfM5wOiT68.jar
    Sep 17, 2020 12:49:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-jZjoueqBye2g0Ik6O6uIv1Q6eSpr7czDXX45jywekVU.jar
    Sep 17, 2020 12:49:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-9tDhgLDG5a0Ac6tWv9Mmmgbj0b9j54Saemuoe9N4Dio.jar
    Sep 17, 2020 12:49:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-vlZdFtQyl4jTX9O-bTLUwOgbn805zqhgKaZ8gZwgOcs.jar
    Sep 17, 2020 12:49:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-MmeC-bgAChdNiO86nvsnuetJRRcnfeDYBBHPq21tljQ.jar
    Sep 17, 2020 12:49:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-oQFP3FPr8TvOheXOs5A2vMhuU4bTzHqmThmOBH8REP8.jar
    Sep 17, 2020 12:49:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-ih2x0hjeuOCpb6U1dpKPnLe_7A3W7KJg0CNdd0MPE6w.jar
    Sep 17, 2020 12:49:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-dwCDT-LaeUvr1MTyxitei1cixow3WVIXTk4bE0iTNsk.jar
    Sep 17, 2020 12:49:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-IuvruH3qzDeor1m6E_yC5zVIvpwg6SZMM7fuuhhlD_Y.jar
    Sep 17, 2020 12:49:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test47767524104222523.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-99YpDolLQfzKpIxHaVx5lgU4rcLHiwXFUSl2kuB6sYU.jar
    Sep 17, 2020 12:49:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-obiY7gLPP-_JRkl609Hkg6_n7HbTzupmSrkuvwxGg7A.jar
    Sep 17, 2020 12:49:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-NFMlFnqihkHmAmjlM86yVG8HVePOgDZfOfyi6ZeJtT4.jar
    Sep 17, 2020 12:49:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-339nP12PisLhqzW1SUtfIfnFFwN4rsk2P5fuIzPRnAs.jar
    Sep 17, 2020 12:49:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-Hz6gwgzKZl_ZAGZwIJjcu2WYooqjM9Za4vRCR3yMMds.jar
    Sep 17, 2020 12:49:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-tXgeOOmtRwQj53xHp8OvBLGscZQMe8Xh31u_0KqSeWQ.jar
    Sep 17, 2020 12:49:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests--opWwWGNR1jkDxBk0Q2BlrA14IAcd5Xwfln2I-8bpNw.jar
    Sep 17, 2020 12:49:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-40LaTqSdIeRQXP9-IGvKtpM0B4RN10TAtgki8MFqKzI.jar
    Sep 17, 2020 12:49:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-Lo_FNlr4i-OwDbVDmvD2qrRGcJUYnfFLY7tIk_IGqZY.jar
    Sep 17, 2020 12:49:07 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-JCWzKIH4IdtW1vEY4-8PmGKXOH3VqvX7Qd0Td2PeiKI.jar
    Sep 17, 2020 12:49:07 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-tkMlbEOvn8H850acISyovt1hy_mNbYu6BFJtUcNJMPY.jar
    Sep 17, 2020 12:49:07 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-Ix7pLsAWsRvcEJRGU8-HF0spcWohhn_xIKf_LYOcHF8.jar
    Sep 17, 2020 12:49:13 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 193 files cached, 26 files newly uploaded in 8 seconds
    Sep 17, 2020 12:49:13 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 17, 2020 12:49:13 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 17, 2020 12:49:14 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 17, 2020 12:49:14 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 17, 2020 12:49:14 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 17, 2020 12:49:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95446 bytes, hash 8a472d4fece699f93fbf25b1cc0edb0b156a9f95d57b9f7f53b15c9604abd7f9> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-ikctT-zmmfk_vyWxzA7bCxVqn5XVe59_U7FclgSr1_k.pb
    Sep 17, 2020 12:49:14 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 17, 2020 12:49:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-16_17_49_15-13930437868880093637?project=apache-beam-testing
    Sep 17, 2020 12:49:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-16_17_49_15-13930437868880093637
    Sep 17, 2020 12:49:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-16_17_49_15-13930437868880093637
    Sep 17, 2020 12:49:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-17T00:49:15.046Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 17, 2020 12:49:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T00:49:24.396Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 17, 2020 12:49:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T00:49:25.517Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 17, 2020 12:49:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T00:49:25.562Z: Expanding GroupByKey operations into optimizable parts.
    Sep 17, 2020 12:49:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T00:49:25.600Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 17, 2020 12:49:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T00:49:25.695Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 17, 2020 12:49:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T00:49:25.736Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 17, 2020 12:49:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T00:49:25.782Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 17, 2020 12:49:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T00:49:25.857Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 17, 2020 12:49:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T00:49:26.540Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 17, 2020 12:49:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T00:49:26.638Z: Starting 5 workers in us-central1-b...
    Sep 17, 2020 12:49:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T00:49:40.447Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 17, 2020 12:50:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T00:49:59.469Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Sep 17, 2020 12:50:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T00:49:59.510Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Sep 17, 2020 12:50:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T00:50:10.403Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 17, 2020 12:50:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T00:50:22.981Z: Workers have started successfully.
    Sep 17, 2020 12:50:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T00:50:23.028Z: Workers have started successfully.
    Sep 17, 2020 12:51:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T00:51:02.491Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 17, 2020 12:51:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T00:51:02.691Z: Cleaning up.
    Sep 17, 2020 12:51:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T00:51:02.845Z: Stopping worker pool...
    Sep 17, 2020 12:52:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T00:52:02.714Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 17, 2020 12:52:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T00:52:02.769Z: Worker pool stopped.
    Sep 17, 2020 12:52:13 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-16_17_49_15-13930437868880093637 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): c5608ce6-d82c-45ce-a37b-86a0ce5e6e98 and timestamp: 2020-09-17T00:52:13.496000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    18.247

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 17, 2020 12:52:14 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.08 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.076 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker,5,main]) completed. Took 3 mins 46.216 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings
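
For a local re-run of just the failing task with the diagnostics Gradle suggests above, an invocation along these lines should work from the repository root (the wrapper path is assumed; the flags are the ones named in this output):
> ./gradlew :sdks:java:extensions:sql:perf-tests:integrationTest --stacktrace --warning-mode all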

BUILD FAILED in 7m 30s
106 actionable tasks: 64 executed, 42 from cache

Publishing build scan...
https://gradle.com/s/6ox3soav7p5s4

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1003

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1003/display/redirect?page=changes>

Changes:

[Luke Cwik] [BEAM-10670] Update Samza to be opt-out for SplittableDoFn.

[noreply] [BEAM-10616] Add missing ParDo test cases for streaming/Flink (#12848)

[noreply] Bump versions of protobuf, shadow, other gradle plugins. (#12821)

[Luke Cwik] [BEAM-10670] Update Jet runner to be opt-out for splittable DoFn

[Luke Cwik] Update runners/jet/build.gradle

[noreply] [BEAM-7523] Fix starting Kafka container twice in KafkaCSVTableIT


------------------------------------------
[...truncated 280.23 KB...]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 16, 2020 6:45:51 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 16, 2020 6:45:51 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 16, 2020 6:45:51 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 16, 2020 6:45:51 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 16, 2020 6:45:51 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 16, 2020 6:45:51 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 16, 2020 6:45:51 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 16, 2020 6:45:51 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 16, 2020 6:45:51 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 16, 2020 6:45:51 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 16, 2020 6:45:51 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 16, 2020 6:45:51 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 16, 2020 6:45:51 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 16, 2020 6:45:52 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 16, 2020 6:45:52 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 16, 2020 6:45:53 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 16, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 16, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-Yl2XWpcEbpQb8IE81AlqdzcSDw8H7Mlk9oe3GNnkxPQ.jar
    Sep 16, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-gxULC99WTlJrhwYBhsluMmHS5iE1GJ6uRzYXH6mU5dA.jar
    Sep 16, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-r9UbWTX1-XhymtznWY4Hdh1uymmTM-bkzFBRdFMFuq8.jar
    Sep 16, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tfbaX8oLaomDs8i8nQs0qj8u2YFAFKCH313WfgbfgCM.jar
    Sep 16, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-1OLzrnwI7niiWxh99ufD7-Xff4xkdwGnffDDo1rFrdc.jar
    Sep 16, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-H3v29aFEe7HJdzo7ZSBmUpomEkmaHP2c-CV0ZzJ0R-A.jar
    Sep 16, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-CqZselwo9o9bna3FVIY5Xyhlkamn27C9fFmqFv82-70.jar
    Sep 16, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test576096033016983519.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-m2roMSdhY_eudOjpI0ZTZVHoyeODVpTkOjAHEytLKvQ.jar
    Sep 16, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-Yl2XWpcEbpQb8IE81AlqdzcSDw8H7Mlk9oe3GNnkxPQ.jar
    Sep 16, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-wimR6Y3IZkDSB7jXw2D16zus0wW07JNBwe5Pc0fs258.jar
    Sep 16, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-9QBXAYqHO3lPLX-IPnbvEfeCOGhbn8284PqNyz8LU7Q.jar
    Sep 16, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-g1A6amZ3x0nXWBRCkfYVVl-8dMoKYcC_rx_3_dzxMk8.jar
    Sep 16, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-2ax4s-6DTr6k95VblmZ6zlu2kh1Lmn6Fcy4btr22e38.jar
    Sep 16, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-NTOPEF-Mwgv_swHTSb-OXUW7sP5N_45_vIfPq2ldztw.jar
    Sep 16, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-U_2S4bp9r8Bq84IsvQn2uUuSUj-dzZDuozvCzIO6vDg.jar
    Sep 16, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-_kspdruASl9xQ4nh4T3hUB1o1igYN1ZnAxpUQN9e4J8.jar
    Sep 16, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-BDNL_KYfJoi3Ce9bdeSl3h2bXU3hD9GIGBHSmVezSoo.jar
    Sep 16, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-onSxJkBsUdW5-gwWv3v95LjO0O6aE--IYUDbBg28QMo.jar
    Sep 16, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-m-k5Bji8fo-dr5-KhrqE3eaL1jgVX3quYsGxYVQo18k.jar
    Sep 16, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-5BFdh4qjzpYx8p0PvmIcGO_6p6Isdv2Hh68e2wtx7pc.jar
    Sep 16, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-v3KFjF8h9AXqILNvI9dYSqKTmfKWIr0edRdTJHlfbpo.jar
    Sep 16, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-uY_hLbn__49xKN5ekkaZwLhL1MJsER8-uKq9rSeG-g0.jar
    Sep 16, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-oT0yCZsWfC2RMDc839vDMUGIDs-PhKpVwm1fpABri_U.jar
    Sep 16, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-dtn4fxJbNGzzMXFRekYxTR3r8LdMLspBFkEu4VoR-5c.jar
    Sep 16, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-Z3eCX4IZYZo3KUar93sopF2U5lIBNxFuveT4h11Hhgs.jar
    Sep 16, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-MSGreGJ5UEZzZqvQwA-41yYnPJ4mBOER0mm8sZlGlJU.jar
    Sep 16, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-STOrKhKobyxOS8k6JO2BS9hj2DFTSTsAod1QWrEevEw.jar
    Sep 16, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-dtHXHi6CujKiitmz3UUXs4sXJxAOTCH4LUcDbZ6nvts.jar
    Sep 16, 2020 6:45:56 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-brvaluTKTtdQ37m4V2JEJoiBoQ6haTBylCIi2g9GpC0.jar
    Sep 16, 2020 6:45:56 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-WFXWkqVJU1NYU-2z1QyxS_IJcZ83fVMUvWoQJiraVo0.jar
    Sep 16, 2020 6:45:56 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-A5UaIrsp26x82yUPnEcESy_J1zHxN4TDWkS1gOM41KQ.jar
    Sep 16, 2020 6:45:56 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 1 seconds
    Sep 16, 2020 6:45:56 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 16, 2020 6:45:56 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 16, 2020 6:45:56 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 16, 2020 6:45:56 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 16, 2020 6:45:56 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 16, 2020 6:45:56 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95444 bytes, hash 1206949bda6f8270c45a12789f9266e5e1320c1f57abe8a5d370bb8a00197b86> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-EgaUm9pvgnDEWhJ4n5Jm5eEyDB9Xq-il03C7igAZe4Y.pb
    Sep 16, 2020 6:45:57 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 16, 2020 6:45:58 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-16_11_45_57-17744232215389877017?project=apache-beam-testing
    Sep 16, 2020 6:45:58 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-16_11_45_57-17744232215389877017
    Sep 16, 2020 6:45:58 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-16_11_45_57-17744232215389877017
    Sep 16, 2020 6:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-16T18:45:57.245Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 16, 2020 6:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T18:46:06.459Z: Worker configuration: n1-standard-1 in us-central1-a.
    Sep 16, 2020 6:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T18:46:07.324Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 16, 2020 6:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T18:46:07.360Z: Expanding GroupByKey operations into optimizable parts.
    Sep 16, 2020 6:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T18:46:07.406Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 16, 2020 6:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T18:46:07.486Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 16, 2020 6:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T18:46:07.528Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 16, 2020 6:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T18:46:07.570Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 16, 2020 6:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T18:46:07.610Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 16, 2020 6:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T18:46:08.056Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 16, 2020 6:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T18:46:08.135Z: Starting 5 workers in us-central1-a...
    Sep 16, 2020 6:46:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T18:46:37.087Z: Autoscaling: Raised the number of workers to 3 based on the rate of progress in the currently running stage(s).
    Sep 16, 2020 6:46:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T18:46:37.121Z: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
    Sep 16, 2020 6:46:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T18:46:37.501Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 16, 2020 6:46:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T18:46:42.515Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 16, 2020 6:47:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T18:46:59.203Z: Workers have started successfully.
    Sep 16, 2020 6:47:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T18:46:59.229Z: Workers have started successfully.
    Sep 16, 2020 6:47:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T18:47:29.293Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 16, 2020 6:47:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T18:47:29.472Z: Cleaning up.
    Sep 16, 2020 6:47:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T18:47:29.554Z: Stopping worker pool...
    Sep 16, 2020 6:48:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T18:48:20.131Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 16, 2020 6:48:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T18:48:20.178Z: Worker pool stopped.
    Sep 16, 2020 6:48:34 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-16_11_45_57-17744232215389877017 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): a27bc37b-aa34-44d0-b380-7e049d17937f and timestamp: 2020-09-16T18:48:34.336000000Z:
                     Metric:                    Value:
                   read_time                    10.627
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 16, 2020 6:48:34 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 3 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.039 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 2 mins 51.942 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 17s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/lkhocbhcxclcm

Stopped 2 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1002

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1002/display/redirect>

Changes:


------------------------------------------
[...truncated 280.15 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 16, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 16, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 16, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 16, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 16, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 16, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 16, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
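
The IllegalStateException above spells out its own remedy: a PCollection of Beam Rows needs an explicit schema (or coder) before the pipeline graph can be finalized. A minimal, self-contained sketch of the PCollection.setRowSchema fix, using an illustrative ParDo that emits Rows (class, step, and field names below are hypothetical, not taken from BigQueryIOPushDownIT):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaFixSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Schema for the four columns projected by the query in the log (types assumed).
        final Schema schema =
            Schema.builder()
                .addNullableField("author", Schema.FieldType.STRING)
                .addNullableField("type", Schema.FieldType.STRING)
                .addNullableField("title", Schema.FieldType.STRING)
                .addNullableField("score", Schema.FieldType.INT64)
                .build();

        PCollection<Row> rows =
            p.apply(Create.of("ignored"))
                .apply(
                    "EmitRows",
                    ParDo.of(
                        new DoFn<String, Row>() {
                          @ProcessElement
                          public void process(OutputReceiver<Row> out) {
                            out.output(
                                Row.withSchema(schema)
                                    .addValues("someone", "story", "a title", 3L)
                                    .build());
                          }
                        }))
                // Without this call, coder inference fails with the same
                // "Unable to return a default Coder ..." error shown above.
                .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }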

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 16, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 16, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 16, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 16, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 16, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 16, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 16, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 16, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
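
For context, the push-down recorded here maps onto BigQueryIO's DIRECT_READ options: the used fields become the selected fields of the Storage API read, and the predicate marked "supported" becomes its row restriction. A rough sketch using the public BigQueryIO API rather than the SQL table provider internals (the table reference below is illustrative):

    import java.util.Arrays;
    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.values.PCollection;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Read only the columns the query uses and let the BigQuery Storage
        // API evaluate the filter that the planner marked as "supported".
        PCollection<TableRow> rows =
            p.apply(
                BigQueryIO.readTableRows()
                    .from("bigquery-public-data:hacker_news.full") // illustrative table
                    .withMethod(Method.DIRECT_READ)
                    .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                    .withRowRestriction(
                        "(`type` = 'story' OR `type` = 'job') AND `score` > 2"));

        p.run().waitUntilFinish();
      }
    }
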
    Sep 16, 2020 12:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-nmL4C9w2QukrugxR2tYGMp9ALnSsKlzNVHhdelznqeA.jar
    Sep 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-51RlGeYpMLMEN1cy2-xcXZsQYXBcrr0EylzNx6afg-E.jar
    Sep 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3621666794091506746.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-SbKq1h-6cJdWa7uWAaspc3mgK05TWD3Mt1A9jvouOjI.jar
    Sep 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-Cjd0mEtIPi4iqNIhuHZ8cwyj21eFUjSZS0RiEhifC5g.jar
    Sep 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-hC_3ZhlnOIqGktFDEfIieLA73NHSHeoZXpNTW_GVbsY.jar
    Sep 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-tTjW4T0uuBDNAWR8G8dSx0_jpnxgppnfCg2HCBdv440.jar
    Sep 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-2u4n6rP2eE8VJxOLmH_keAcwTGrMHL9WQ7P1t2jYIoc.jar
    Sep 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-8iDAZ1olcbpAcAP3ZWEPbQDjpeojRJXTEzkqLKYgI-Y.jar
    Sep 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-jtIVPXzk5SlS3QE97P1UIz3UZ3Fuz449mrJhy5NeQZE.jar
    Sep 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-61OF19IX8ARU9x1ZCom0GMWwqfcza953_APM1XY-oTs.jar
    Sep 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-D-FurGy57UsWHZr_sHY2R4GIW2SDlLI8FOgSTvrnsHo.jar
    Sep 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-ScqOSdz0A8TSWAyyUc3qbtW0Oj7uEhXmzS9D2FZQBUc.jar
    Sep 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-nmL4C9w2QukrugxR2tYGMp9ALnSsKlzNVHhdelznqeA.jar
    Sep 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-UFYxARfDybLdYs1M9YWRL4unbDA06_0hy6o1mkZB1ZU.jar
    Sep 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-CFrDK_eBeqL8PBL2f9lOHpjJ_3XPfK21SmmyC_6VEVY.jar
    Sep 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-Gkh0z8xFTJFByzWqarhUuXjNyyKcOgv-cw6nE71u2zw.jar
    Sep 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-XtdQa5WT6GrP7AEKdk2WROMUcHhqX2D5FF1Hz9Db2cM.jar
    Sep 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-z6Y_NrUcj5lexv_KZrgeH4shWuAY_nOztNdBO22K5wE.jar
    Sep 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-PD2olcvutIRb4gMkrgXgb8gJUctG01zHtHbGiEJy4bk.jar
    Sep 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-Wd8Q8Kwdabnsc4HXyKSVlbFUnzhfWvxdFigXZzcK2TU.jar
    Sep 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT--CmF8d57760uieCPQt2BQI462M3LUV1lk7hnK58fbU4.jar
    Sep 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-oHcKk0CUABK0k4BgFzuwZiE3xqxA7qTcabh8WG62uME.jar
    Sep 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-s6lSTliAYLc_-NIUzZlNaMZfV7PQ5KNAomDgZSSKyB8.jar
    Sep 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-7WyR4DZR6zlAE6aRrOs8sTCijptUNqDfn3uwaKWd_KE.jar
    Sep 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-XcZjsO20n54gsYr_8AWcnHIQ17Brv1cdwtQ8YOzDce4.jar
    Sep 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-dJPbPLqBFoXQsnux5Ccwg8XulV5J2t8Z30dTwiJENHw.jar
    Sep 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-7PDhOk5qsdJCxcztkN-cOxHQ-EHzkm5BVqabLe-_QRY.jar
    Sep 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-tNxSBE3yOynJtJUVNFzW6qh_pS3My19WB492BE1R_b4.jar
    Sep 16, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT--y77zi3cm0sHv3S0e_5kJgHRh4WaD8C693Tjl68JECI.jar
    Sep 16, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-ZRs_sT8p2zm3Tqx167pZ4qWgWmp8Yurx23_RnfhxOuk.jar
    Sep 16, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-k-an3t3icmNB4haQ_71ChjTx8e9ZQzXviIXi74-4gLo.jar
    Sep 16, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 1 seconds
    Sep 16, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 16, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 16, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 16, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 16, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 16, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95445 bytes, hash 6bb2c62983fa666974b536ae931950511388c8149134b83ae65da77c41cac3be> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-a7LGKYP6Zml0tTaukxlQUROIyBSRNLg65l2nfEHKw74.pb
    Sep 16, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 16, 2020 12:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-16_05_45_21-15069650673728437973?project=apache-beam-testing
    Sep 16, 2020 12:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-16_05_45_21-15069650673728437973
    Sep 16, 2020 12:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-16_05_45_21-15069650673728437973
    Sep 16, 2020 12:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-16T12:45:21.994Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 16, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T12:45:31.805Z: Worker configuration: n1-standard-1 in us-central1-f.
    Sep 16, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T12:45:32.464Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 16, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T12:45:32.519Z: Expanding GroupByKey operations into optimizable parts.
    Sep 16, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T12:45:32.549Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 16, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T12:45:32.634Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 16, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T12:45:32.653Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 16, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T12:45:32.676Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 16, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T12:45:32.702Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 16, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T12:45:33.099Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 16, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T12:45:33.175Z: Starting 5 workers in us-central1-f...
    Sep 16, 2020 12:45:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T12:45:59.228Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 16, 2020 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T12:46:06.294Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 16, 2020 12:46:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T12:46:25.193Z: Workers have started successfully.
    Sep 16, 2020 12:46:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T12:46:25.223Z: Workers have started successfully.
    Sep 16, 2020 12:46:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T12:46:58.920Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 16, 2020 12:46:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T12:46:59.084Z: Cleaning up.
    Sep 16, 2020 12:46:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T12:46:59.161Z: Stopping worker pool...
    Sep 16, 2020 12:47:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T12:47:41.400Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 16, 2020 12:47:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T12:47:41.450Z: Worker pool stopped.
    Sep 16, 2020 12:47:50 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-16_05_45_21-15069650673728437973 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 575430a6-1817-4e83-b86b-73e1e6c09fbf and timestamp: 2020-09-16T12:47:50.062000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    13.657

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 16, 2020 12:47:50 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 2 mins 44.102 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 33s
107 actionable tasks: 71 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/j7gquxscxy7tw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1001

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1001/display/redirect?page=changes>

Changes:

[daniel.o.programmer] [BEAM-6928] Update changelog with WriteToBigQuery changed requirements.


------------------------------------------
[...truncated 279.91 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 16, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 16, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 16, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 16, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 16, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 16, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 16, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
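
The failing call chain bottoms out in BeamSqlRelUtils.toPCollection, which is the test harness's way of executing the planned query; the user-facing equivalent is SqlTransform. A minimal sketch of running a query of the same shape against an in-memory table (schema, rows, and data are illustrative, not the IT's actual setup):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.PCollectionTuple;
    import org.apache.beam.sdk.values.Row;
    import org.apache.beam.sdk.values.TupleTag;

    public class SqlTransformSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        Schema schema =
            Schema.builder()
                .addNullableField("by", Schema.FieldType.STRING)
                .addNullableField("type", Schema.FieldType.STRING)
                .addNullableField("title", Schema.FieldType.STRING)
                .addNullableField("score", Schema.FieldType.INT64)
                .build();

        // A tiny in-memory stand-in for the HACKER_NEWS table.
        PCollection<Row> hackerNews =
            p.apply(
                Create.of(
                        Row.withSchema(schema).addValues("a", "story", "t1", 5L).build(),
                        Row.withSchema(schema).addValues("b", "job", "t2", 1L).build())
                    .withCoder(RowCoder.of(schema)));

        // The tag name becomes the table name visible to the SQL query.
        PCollection<Row> result =
            PCollectionTuple.of(new TupleTag<Row>("HACKER_NEWS"), hackerNews)
                .apply(
                    SqlTransform.query(
                        "SELECT `by` AS author, `type`, title, score "
                            + "FROM HACKER_NEWS "
                            + "WHERE (`type` = 'story' OR `type` = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }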

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 16, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 16, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 16, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 16, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 16, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 16, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 16, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 16, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 16, 2020 6:45:14 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 16, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 16, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-PlpNHP2AVrXJNbpVCYWpS-Z5Kz6JbPH7scTSOfgCk3I.jar
    Sep 16, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-ev5DcQT5jGR-WZq-ZWGbpDAnYBdzrSZOLoPTFMciiF4.jar
    Sep 16, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test6521923645571006288.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-JFiidMgtwWNyRF4AATwcugDaGa8HjqexZtlFLDjN7nA.jar
    Sep 16, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-dGN2xv4mExAe0vVuO6AvTokjoQ6Equh4E4fFZgZgAlA.jar
    Sep 16, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-eUq97lwZoba4ernCx7g_4dkM7TD4u9U9JAh9pnhh9P4.jar
    Sep 16, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-yNh6LjUMHh_kJys-7QxdS35_8Vk5oDeK8b05LE9dFQg.jar
    Sep 16, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-kPFfy-FS1ZIsXRxeq_szbhafPMhXbN1wiKZsr2Q5ZpQ.jar
    Sep 16, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-3_LUR6G-bgwJ5SkO6OEoDakkQrOBOcesNmoo2YnsaQY.jar
    Sep 16, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-j3AFvG2M051Tbfkwoz73Sn9o_YaawXH1ipljBFz2Z6A.jar
    Sep 16, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-coSh5HSbpEU_ryJ4Bu57Id4QEH80LPGpQaD3LRKp63A.jar
    Sep 16, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-5moEgJn6KDTBlC3hkxvtFyKY5We3V3V_zbVsDEMn2SM.jar
    Sep 16, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-2rtR1unJm3rN5LPLndIGRjUfq3AQCzZCddiKVVXHqtE.jar
    Sep 16, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-y14MkyiSEzT-IqW_QhASmP23SKbw9Cpeo0yszYgFGX0.jar
    Sep 16, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-PlpNHP2AVrXJNbpVCYWpS-Z5Kz6JbPH7scTSOfgCk3I.jar
    Sep 16, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-QaXjpTNcyq5_aqnYzipc-2BmSeVy_6eo55Xb6LKGkH0.jar
    Sep 16, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-0J5LjgbLAhfPYYjtlmHeahf951yUw8NoM-ESPgyhEoM.jar
    Sep 16, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-2UW7wmnGxYL3lVO-BZn3-s8CWTbd858ODbKDa-nZCMQ.jar
    Sep 16, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-YiMcUsZSVf5HdAvEqpEmSz-XCXAPgfOc-yhLj67FR8Q.jar
    Sep 16, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-26AKMWm945UPQn4IMJSEZPeYejnVEobl28ktKfrz9_A.jar
    Sep 16, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-gnnWqi99lykg0m-tDAO_caa60w-MZKy_Ui2cvXIt1PU.jar
    Sep 16, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-fh5cXbudHfjlg7I2tBUGbywENg-hkEgGXvtWrZ2YAsU.jar
    Sep 16, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-uYWYszhZN9n_utjJDhLxX6hFS_YK503lmcwkKwGVl44.jar
    Sep 16, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-hKEoSgi-GaR-GcocZ2imeyPhnaapUaDxi-ldJQhnbgM.jar
    Sep 16, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-EP0nVgb-PYh6cwtgc6HsyRkld7qVQ3Dl_3Xw5Yw3yio.jar
    Sep 16, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-BFAjKe_Sw0WU1jkoRpxdGqR-46jv9pfbo0fWiIX5wuQ.jar
    Sep 16, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-nHcIVwLfEfBcnTTmLmrLBh5etRilNSKza2GNpE-40HQ.jar
    Sep 16, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-ygKlEV8J35L4GW0zTOvg-qhKnRDjvmaxc5VXNjdlL1A.jar
    Sep 16, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-rZZGCwXmsJ155W3WO5gRaD3CE9OeYQFnUgQIJ_oHCrQ.jar
    Sep 16, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-PEf6frSkbETB4TH1Yx0n78hLPcZIUWUedONjzOVMyUE.jar
    Sep 16, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-4wSObsZJCp3ZrLPDhFza59FPheRpzA_YbjhVJd3SAX0.jar
    Sep 16, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-PrU521v9DYhdP7yFWvMX4b413ToxF3vy0Ws8j_YBeUg.jar
    Sep 16, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 1 seconds
    Sep 16, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 16, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 16, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 16, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 16, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 16, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95445 bytes, hash 22b54b2eb4203b9f91877121b281db6a76582d103a295ca5bae0d27b560e81a8> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-IrVLLrQgO5-Rh3EhsoHbanZYLRA6KVyluuDSe1YOgag.pb
    Sep 16, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 16, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-15_23_45_18-14711656897969555355?project=apache-beam-testing
    Sep 16, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-15_23_45_18-14711656897969555355
    Sep 16, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-15_23_45_18-14711656897969555355
    Sep 16, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-16T06:45:18.537Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 16, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T06:45:27.233Z: Worker configuration: n1-standard-1 in us-central1-f.
    Sep 16, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T06:45:27.946Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 16, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T06:45:27.989Z: Expanding GroupByKey operations into optimizable parts.
    Sep 16, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T06:45:28.016Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 16, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T06:45:28.152Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 16, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T06:45:28.177Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 16, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T06:45:28.201Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 16, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T06:45:28.222Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 16, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T06:45:28.612Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 16, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T06:45:28.696Z: Starting 5 workers in us-central1-f...
    Sep 16, 2020 6:45:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T06:45:54.011Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 16, 2020 6:46:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T06:46:01.649Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 16, 2020 6:46:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T06:46:13.009Z: Workers have started successfully.
    Sep 16, 2020 6:46:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T06:46:13.047Z: Workers have started successfully.
    Sep 16, 2020 6:46:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T06:46:48.465Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 16, 2020 6:46:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T06:46:48.633Z: Cleaning up.
    Sep 16, 2020 6:46:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T06:46:48.716Z: Stopping worker pool...
    Sep 16, 2020 6:47:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T06:47:32.136Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 16, 2020 6:47:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T06:47:32.183Z: Worker pool stopped.
    Sep 16, 2020 6:47:39 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-15_23_45_18-14711656897969555355 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): d7563cf7-33ff-4199-ab98-d914c5bc37de and timestamp: 2020-09-16T06:47:39.904000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    15.511

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 16, 2020 6:47:40 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 2 mins 35.299 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 23s
107 actionable tasks: 71 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/ssyu7uvrts5ku

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1000

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1000/display/redirect?page=changes>

Changes:

[txian] Support NUMERIC in spanner schema parser

[Kyle Weaver] [BEAM-9575] Only copy the Spark runner jar, not whatever other jars


------------------------------------------
[...truncated 281.26 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 16, 2020 12:45:38 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 16, 2020 12:45:38 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 16, 2020 12:45:39 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 16, 2020 12:45:39 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 16, 2020 12:45:39 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 16, 2020 12:45:39 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 16, 2020 12:45:39 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
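
A minimal, hedged sketch of the remedy the exception itself suggests (PCollection.setRowSchema). The class name, two-field schema, and toy input below are illustrative only and are not taken from BigQueryIOPushDownIT; they show the general pattern of attaching a row schema so a coder for PCollection<Row> can be inferred:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      // Toy schema standing in for the projected columns; field names are illustrative.
      private static final Schema SCHEMA =
          Schema.builder().addStringField("type").addInt64Field("score").build();

      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        PCollection<Row> rows =
            p.apply(Create.of("story,3", "job,5"))
                .apply(
                    "ToRow",
                    ParDo.of(
                        new DoFn<String, Row>() {
                          @ProcessElement
                          public void process(@Element String line, OutputReceiver<Row> out) {
                            String[] parts = line.split(",");
                            out.output(
                                Row.withSchema(SCHEMA)
                                    .addValues(parts[0], Long.parseLong(parts[1]))
                                    .build());
                          }
                        }))
                // Without this call, coder inference for the PCollection<Row> fails with
                // the same IllegalStateException shown above.
                .setRowSchema(SCHEMA);

        p.run().waitUntilFinish();
      }
    }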

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 16, 2020 12:45:39 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 16, 2020 12:45:39 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 16, 2020 12:45:39 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 16, 2020 12:45:39 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 16, 2020 12:45:39 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 16, 2020 12:45:39 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 16, 2020 12:45:39 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 16, 2020 12:45:39 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
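
For context, the predicate being pushed down above can be exercised in isolation with Beam SQL against an in-memory PCOLLECTION table. This is a hedged sketch with a reduced, illustrative schema (type, title, score) rather than the full HACKER_NEWS BigQuery table used by this test:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class PushDownPredicateSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        Schema schema =
            Schema.builder()
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();

        PCollection<Row> hackerNews =
            p.apply(
                Create.of(
                        Row.withSchema(schema).addValues("story", "a", 5L).build(),
                        Row.withSchema(schema).addValues("comment", "b", 9L).build())
                    .withCoder(RowCoder.of(schema)));

        // Same WHERE clause the planner pushes down above, applied to the single
        // input PCollection, which Beam SQL exposes as the PCOLLECTION table.
        PCollection<Row> filtered =
            hackerNews.apply(
                SqlTransform.query(
                    "SELECT `type`, `title`, `score` FROM PCOLLECTION "
                        + "WHERE (`type` = 'story' OR `type` = 'job') AND `score` > 2"));

        p.run().waitUntilFinish();
      }
    }

With the BigQuery table provider and the DIRECT_READ method, the planner additionally pushes this predicate and the used fields into the storage read itself, as the BeamPushDownIOSourceRel node above shows.
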
    Sep 16, 2020 12:45:40 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 16, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 16, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-lGD4cGIAf4wy_fyqVk5_YIWRnk28ALxJT4TGBkU_jRw.jar
    Sep 16, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-Uo06AnssB0aAmNpcBQEAUiQCxl2F4WBYoHUaeLFaA2g.jar
    Sep 16, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-lGD4cGIAf4wy_fyqVk5_YIWRnk28ALxJT4TGBkU_jRw.jar
    Sep 16, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-_V_wBWmFrhMxWGphLn9ML5XUHiDW6G_PBDbAz_37q18.jar
    Sep 16, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-z-w3YVRrp9Glc1tis_27Kfn-DPakmScLNVEH8Mb3zTY.jar
    Sep 16, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-VgsalEcERZ4gjPz5KPUaN0cI4ZATcd9Rqdrk3qhXqew.jar
    Sep 16, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-sFulctzKu-dV22bydWUrwCkzgeW2p3eOdpRD0JtV7kI.jar
    Sep 16, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-CWPT5r6rJD5cfc4-bRxnzJEFnHg1RQdqWZ6dCNwYd8s.jar
    Sep 16, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-vFFflHk4GDTJibYLe85xRkDKSCT6T0sty6V9mM8X5FY.jar
    Sep 16, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-1AaUqxuIP7tDekYmeSxeYyqvIfdhYcZxiea95M-M-xU.jar
    Sep 16, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-Y9b7ZGo_KoAGs3mgEPXNUbASAOYkpHUd6Hw7Dr7C40o.jar
    Sep 16, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-8MmMfKJ-6FCpONpFdE7OMgeIQG9YNURwj7LPMGlu9tk.jar
    Sep 16, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-3N4fEOThe_EWMjbg-8QidAhpt5ti1u2Atdd6a68L0EQ.jar
    Sep 16, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-I_1bwSwVl_MJRuVxMui4W3j1I035caR0Aq-XRf3xoVM.jar
    Sep 16, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-Jwapr3wyfrmmA8kPWPOaw4YzOBxMvreA844_FlCHq64.jar
    Sep 16, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-Y0ffWLKf5NoB0EupT33NMgWgAiYYNB5KtazvWRHEdok.jar
    Sep 16, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-VofPVylV91GZYKPUa2ONeBYkQzh4tGJNgUUWvFU52qs.jar
    Sep 16, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-pV7mNkam8g0lGgvwxaJ5vXQDjPLI7IxjWeNX7FY3VCc.jar
    Sep 16, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-PBOPummQvYy-5ZY1S05kXSoVZTpiy991fZZvu4uhc9c.jar
    Sep 16, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-zWla5RFV1-w2F6cXTc42hNJpsoO4hacYHOeYI-CQAWA.jar
    Sep 16, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-A2A8AjjE6Z336VZ7rjkCmww8kGc06nfO1NdbPY5-ONo.jar
    Sep 16, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-AGt20y6-Z8dDKezniRqkICWX4ISNnKu0i4He75b4aCU.jar
    Sep 16, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-HGEmLAAo0aHHai1MeXwZpf5RAduFoZB19746DIYBVxo.jar
    Sep 16, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-fiprJUuVj7HOaxWftxckiOepgPWMBhgX00ae0UFk8Fo.jar
    Sep 16, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-uuiLvhHIKycqMdUu6ZtezK5pRnMxp6wYN5VQzQTCHQk.jar
    Sep 16, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2853864065336070637.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-6yo2WsaosGAqjJrecyPnLXMUWFRwxtxiU0-TkYQHkxM.jar
    Sep 16, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-3TDk2Rwrq5D_FAKaFgEnyxtdnZQ29Fy2dxKV8xT0gpE.jar
    Sep 16, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-RW95Xb36QedKHcRDVzwt4AqrYBTlwvci3wGMdjRJsaY.jar
    Sep 16, 2020 12:45:44 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-AfBjIucKF7iAdQiLoNZ82jyqQCVeU1O49aOEbmh11vU.jar
    Sep 16, 2020 12:45:44 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-osNcjQ7wV7NWrz9fnmoOgURqF9uh_aaJ5TBnpOlKoPo.jar
    Sep 16, 2020 12:45:44 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-CnZhDA9pmbKmHfR6s_x9S6fapbNvrZLEvD_mL2-0jm4.jar
    Sep 16, 2020 12:45:44 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 1 seconds
    Sep 16, 2020 12:45:44 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 16, 2020 12:45:44 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 16, 2020 12:45:44 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 16, 2020 12:45:44 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 16, 2020 12:45:44 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 16, 2020 12:45:44 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95445 bytes, hash f37b7b724a9962517d5877da0e202f2b86fd1c128a20e4c7a90c350defd74a7a> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-83t7ckqZYlF9WHfaDiAvK4b9HBKKIOTHqQw1De_XSno.pb
    Sep 16, 2020 12:45:45 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 16, 2020 12:45:46 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-15_17_45_45-2233762094182272882?project=apache-beam-testing
    Sep 16, 2020 12:45:46 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-15_17_45_45-2233762094182272882
    Sep 16, 2020 12:45:46 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-15_17_45_45-2233762094182272882
    Sep 16, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-16T00:45:45.263Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 16, 2020 12:45:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T00:45:54.466Z: Worker configuration: n1-standard-1 in us-central1-a.
    Sep 16, 2020 12:45:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T00:45:55.399Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 16, 2020 12:45:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T00:45:55.458Z: Expanding GroupByKey operations into optimizable parts.
    Sep 16, 2020 12:45:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T00:45:55.489Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 16, 2020 12:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T00:45:55.574Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 16, 2020 12:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T00:45:55.610Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 16, 2020 12:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T00:45:55.647Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 16, 2020 12:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T00:45:55.679Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 16, 2020 12:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T00:45:56.036Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 16, 2020 12:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T00:45:56.117Z: Starting 5 workers in us-central1-a...
    Sep 16, 2020 12:46:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-16T00:46:20.605Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 16, 2020 12:46:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T00:46:23.574Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 16, 2020 12:46:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T00:46:42.438Z: Workers have started successfully.
    Sep 16, 2020 12:46:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T00:46:42.473Z: Workers have started successfully.
    Sep 16, 2020 12:47:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T00:47:22.448Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 16, 2020 12:47:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T00:47:22.604Z: Cleaning up.
    Sep 16, 2020 12:47:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T00:47:22.680Z: Stopping worker pool...
    Sep 16, 2020 12:48:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T00:48:11.665Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 16, 2020 12:48:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T00:48:11.707Z: Worker pool stopped.
    Sep 16, 2020 12:48:29 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-15_17_45_45-2233762094182272882 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 9f8c3184-fba6-4fa2-a7c3-6d5aee9f7c9b and timestamp: 2020-09-16T00:48:29.177000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    21.642

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 16, 2020 12:48:29 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.032 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker,5,main]) completed. Took 3 mins 0.053 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 13s
107 actionable tasks: 73 executed, 34 from cache

Publishing build scan...
https://gradle.com/s/wnepw6sgbwugu

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #999

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/999/display/redirect?page=changes>

Changes:

[Robert Bradshaw] Use index for partitioning of elementwise operations with multiple

[Robert Bradshaw] [BEAM-10873] Introduce partitioning session for stronger testing.

[Robert Bradshaw] [BEAM-10873] Use partitioning session for tests.

[noreply] Merge pull request #12704 from [BEAM-10603] Implement the new Large

[noreply] write to file ability for java Nexmark suite (#12813)

[noreply] add readme file to python nexmark suites (#12808)

[noreply] Merge pull request #12807 from [BEAM-2855] implement query 10

[noreply] Merge pull request #12770 from [BEAM-10545] Assembled the extension with

[noreply] Document GroupBy transform. (#12834)

[noreply] [BEAM-10886] Fix Java Wordcount Direct Runner (windows-latest) (#12846)

[noreply] * [BEAM-10705] Extract and use the filename when downloading a remote

[noreply] [BEAM-10641] Add eliminate_common_key_with_none graph optimizer (#12787)


------------------------------------------
[...truncated 280.35 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 15, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 15, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 15, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 15, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 15, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 15, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 15, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
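
The same coder failure recurs here. Besides the setRowSchema sketch shown earlier, the first remedy listed in the message (.setCoder()) can be applied explicitly; a small illustrative helper, with hypothetical names:

    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    final class ExplicitRowCoderSketch {
      // Attaches an explicit RowCoder to a PCollection<Row> whose coder could not be
      // inferred; 'schema' must describe the fields the upstream ParDo actually emits.
      static PCollection<Row> withExplicitCoder(PCollection<Row> rows, Schema schema) {
        return rows.setCoder(RowCoder.of(schema));
      }
    }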

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 15, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 15, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 15, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 15, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 15, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 15, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 15, 2020 6:45:21 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 15, 2020 6:45:21 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 15, 2020 6:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 15, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 15, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-UZHdoOJj21ON2QOvSX5n3yk2leyPyUl6AQF12vyb6qs.jar
    Sep 15, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-UXq3UAiZ9kiQXd4at19jok318qBl-rxmXgPMTM2W6J4.jar
    Sep 15, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-UZHdoOJj21ON2QOvSX5n3yk2leyPyUl6AQF12vyb6qs.jar
    Sep 15, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-qsuLs3xYkAOqLE48zielqwQYBJBsclSIc431FEyJ5kA.jar
    Sep 15, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-cQDIu1cCC3JF6NVmjZyt_DGT0U5ikTVD5Tjzptf6InY.jar
    Sep 15, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-KXmwuBPfvMEumKQalOJcuCfqo9G8M-Obi4VVhdlR7vA.jar
    Sep 15, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-TFa5_GC643ZgbM86mPfmAAUd1o7211bnf0Va4LXBvIY.jar
    Sep 15, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-iYobWmzhXaik9CFAIehzJm5ZH2eEKs6Eyfzy9jVIRbI.jar
    Sep 15, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-Q4ewG4XiYSmP95Bo3f_VuOoxxICzgM3ROYzKLbmtyIo.jar
    Sep 15, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-mEFbUWNeFQaej8odsy22gjaNId7EmEtVDlHewCbxfqk.jar
    Sep 15, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-_fYwK--mE7D3twb3AFn3jM_XKbTEkJfmvT579Qvg6XU.jar
    Sep 15, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-kUb5sVV273IqtqJrn9caYF4zwu8EOORlQXHeut_q4r8.jar
    Sep 15, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-DtpJ4ITOv2xivGkCDRs9QAR_b0EhyY_TKQgI3iAlQ50.jar
    Sep 15, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test615900106991142805.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-85s-S0vrjskpgQd1WP9z8oU6rvEPO9_wMkC9Q61y__g.jar
    Sep 15, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-yxPy_3lzTv8dZcvyu0yM_sfoctUmtaR0HVRdGfWe9uA.jar
    Sep 15, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-rPcluKr2Xvp8Im0yEDofen2Klfe_yo_lqDLyFP4JFOE.jar
    Sep 15, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-3gTBw0njfCLG-M3faDG32a7khc-XLFATnM1gxelimjQ.jar
    Sep 15, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-BfRzv9rTnxR6v69iDXIycKGzFXnA42pB4rCpGMWSIr0.jar
    Sep 15, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-e2kfPC0uJ90jknnxDEtYZ-unrmr1oop3o3SUniifOOI.jar
    Sep 15, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-1LBtPnNKuL84Sz7WUBVHcUVeoFRp5M0ZZVAA9klFgYw.jar
    Sep 15, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-IuSGpBG9VxJ44QsOusPwk2ZP9omJxT9vRFVECdBVukg.jar
    Sep 15, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-Vl8mNNFv3hdJ8T5BieJgTTe_DVU_7i--KjYo24hzr80.jar
    Sep 15, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-V3SqLQWTpK8L8MIooIubLXx9_iO6GoshtxgSCa7k6kU.jar
    Sep 15, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-PC4Yous5w1qVb4eGDRqJhwLeQ7xhnx6lXsIF7FsdiUw.jar
    Sep 15, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-xK9rsBkaf-ChHlqUwCcGIYy2v4IojD9R7eGWbPi27-Q.jar
    Sep 15, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-u3nc60X54G_xv7xJJ1mkz91I-gbpcTOnZpjlZXRROI4.jar
    Sep 15, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-wtm7bpOhbIbglV9aKGrprT7g151603jCzv79QcboZr0.jar
    Sep 15, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-HhV_FnnzfEPrB3PT3L74cf0X6AEMGn3WqNom4W_bl9I.jar
    Sep 15, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-pNb6iXgweNf0Rw1GmiuTNmhSaX-kJplvdZs12skc9sk.jar
    Sep 15, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-nY5Wf_AXhMyBU2-y6tgCva0HHhoecCeueTGMhKpIxnE.jar
    Sep 15, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-4RkIbBZeu1JZAoKPyYGH7VZmmvGh5toPlMWs-sOJtjA.jar
    Sep 15, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 1 seconds
    Sep 15, 2020 6:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 15, 2020 6:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 15, 2020 6:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 15, 2020 6:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 15, 2020 6:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 15, 2020 6:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95444 bytes, hash 8c489501e908f6cbedbe845f26bf4d87a7ae74e8834a357b800098e8f86e80c4> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-jEiVAekI9svtvoRfJr9Nh6eudOiDSjV7gACY6PhugMQ.pb
    Sep 15, 2020 6:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 15, 2020 6:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-15_11_45_26-7269939878914622645?project=apache-beam-testing
    Sep 15, 2020 6:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-15_11_45_26-7269939878914622645
    Sep 15, 2020 6:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-15_11_45_26-7269939878914622645
    Sep 15, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-15T18:45:26.491Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 15, 2020 6:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T18:45:35.218Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 15, 2020 6:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T18:45:35.796Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 15, 2020 6:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T18:45:35.841Z: Expanding GroupByKey operations into optimizable parts.
    Sep 15, 2020 6:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T18:45:35.868Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 15, 2020 6:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T18:45:36.007Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 15, 2020 6:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T18:45:36.039Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 15, 2020 6:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T18:45:36.067Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 15, 2020 6:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T18:45:36.098Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 15, 2020 6:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T18:45:36.446Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 15, 2020 6:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T18:45:36.520Z: Starting 5 workers in us-central1-b...
    Sep 15, 2020 6:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-15T18:45:45.698Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 15, 2020 6:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T18:46:02.296Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 15, 2020 6:46:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T18:46:20.652Z: Workers have started successfully.
    Sep 15, 2020 6:46:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T18:46:20.677Z: Workers have started successfully.
    Sep 15, 2020 6:46:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T18:46:54.009Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 15, 2020 6:46:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T18:46:54.164Z: Cleaning up.
    Sep 15, 2020 6:46:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T18:46:54.252Z: Stopping worker pool...
    Sep 15, 2020 6:47:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T18:47:39.993Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 15, 2020 6:47:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T18:47:40.038Z: Worker pool stopped.
    Sep 15, 2020 6:47:48 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-15_11_45_26-7269939878914622645 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): a3d19f85-4a61-446b-aa7a-000d0a95d539 and timestamp: 2020-09-15T18:47:48.921000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    14.781

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 15, 2020 6:47:49 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.022 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 2 mins 38.929 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 30s
107 actionable tasks: 71 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/yx3ka3iywxuvu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #998

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/998/display/redirect>

Changes:


------------------------------------------
[...truncated 279.42 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 15, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 15, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 15, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 15, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 15, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 15, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 15, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
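
Both failing tests hit the same IllegalStateException shown above: the PCollection<Row> produced by the RowMonitor ParDo has no coder, and the message itself names the two remedies (PCollection.setRowSchema or setCoder). The sketch below only illustrates what the exception is asking for, not the test's actual fix; the field names come from the SELECT list in the query above, while the field types and nullability are assumptions.

    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    class RowSchemaSketch {
      // Attach an explicit schema to a PCollection<Row> so the SDK can infer a
      // RowCoder for it. Field types below are assumed, not taken from the log.
      static PCollection<Row> withProjectedSchema(PCollection<Row> rows) {
        Schema schema =
            Schema.builder()
                .addNullableField("author", Schema.FieldType.STRING)
                .addNullableField("type", Schema.FieldType.STRING)
                .addNullableField("title", Schema.FieldType.STRING)
                .addNullableField("score", Schema.FieldType.INT64)
                .build();
        // Either line satisfies the coder check named in the exception:
        rows.setRowSchema(schema);
        // rows.setCoder(RowCoder.of(schema));
        return rows;
      }
    }

In these tests the Row output is produced inside BeamSqlRelUtils.toPCollection, so any real fix belongs in the SQL extension rather than in test code; the sketch is only a minimal illustration of the error message's suggestion.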

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 15, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 15, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 15, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 15, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 15, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 15, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 15, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 15, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
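
As an editorial aside, the two entries above show what DIRECT_READ push-down buys: only the four used fields are read, and the filter is evaluated server-side through the BigQuery Storage API. A standalone BigQueryIO read with the same projection and row restriction would look roughly like the sketch below; the table reference is a hypothetical placeholder, since the test's actual table is configured elsewhere, and the restriction string is copied from the log entry above.

    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class DirectReadSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
        p.apply(
            "Read HACKER_NEWS with push-down",
            BigQueryIO.readTableRows()
                .from("my-project:my_dataset.hacker_news")           // hypothetical table
                .withMethod(TypedRead.Method.DIRECT_READ)            // BigQuery Storage API read
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                .withRowRestriction(
                    "(`type` = 'story' OR `type` = 'job') AND `score` > 2"));
        p.run().waitUntilFinish();
      }
    }

Submitting such a pipeline to Dataflow, as the jobs in this log are, would additionally need the usual --runner=DataflowRunner, --project, --region, and --tempLocation options.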
    Sep 15, 2020 12:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 15, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 15, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-v2B0WPzRzqqUhIB_3LizdqTa1tOtANczHhi0OSd2g8U.jar
    Sep 15, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-vBYbKYolFcq1YxMVwYMpoyq6GUF-Mpp5YHaTXHoCBi4.jar
    Sep 15, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-zGdc_OoTGwzCFex8ogRYzTVeVJXc_zQ_diTDXg2PnMQ.jar
    Sep 15, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3384055962582005227.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-RX97iu5urtuPNJDiJKUPWlqgp7pb237fCW66EYq4Tgg.jar
    Sep 15, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-ysrc6WY80cIE39dM1MdKq6ClwkTBImIhUNl2mTbs1n4.jar
    Sep 15, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-Id36CA1mPC0LHQCzFH4K8s0zt95TWTcAK33uFYL0MMY.jar
    Sep 15, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-I7VgzsDyLXpsW1IoBglsQV7peS0yMBqsXLUrNYyzdAo.jar
    Sep 15, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-_nE6vVkOhSKbDuJ1Gx0eZEG5ydSlpq8wdfPsp3tSVrA.jar
    Sep 15, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-jNZyFWoDmaPsoq2PGaCTj5WShwpC-FA7lx0wLSioQlM.jar
    Sep 15, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-OM2Zg03xWg_aU3-D8blUgJZkuXimcr4t4tZduGLSyrc.jar
    Sep 15, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-gAGdM9qTvnOyzs1hRJUOclmtGpr0crnp49lOTB6uxfc.jar
    Sep 15, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-eRzhW0_wjzMyOheX7PK9v8Kjr4tNtc5VCZGwYvyaTyE.jar
    Sep 15, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-oA5zeblic9Owb3Jnnb_JIiiDUHkUb9qqooxjijSTRNg.jar
    Sep 15, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-jYEyoI0fpp-s6Z5A7857eI8IkqKAC570gvRxZb373So.jar
    Sep 15, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-rEpBfo4k90B5DPdJSxHIHE-_00KmWX8ZqMLSzXqrMNo.jar
    Sep 15, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-Gtg4Lu4_i7_D-LlV9yKT1ymCbfmH_3OIDS3e9D_0R0g.jar
    Sep 15, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-nT2ee2jiqI8BszraiJSYh2BoTk90D8814Gc0j8qpCvk.jar
    Sep 15, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-mL0h0qhs1VsvP4zDZXKlGBx998y07rkJzEZcC0lOH_I.jar
    Sep 15, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-U4_qVfy-Ofc_cYjWpA58xOoF3M8OoJCV8m--JHLfMMk.jar
    Sep 15, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-9vAPiaaL997uPRJGvMK48ESJ-BlAqIQV36vEDH6h4Zk.jar
    Sep 15, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-I7VgzsDyLXpsW1IoBglsQV7peS0yMBqsXLUrNYyzdAo.jar
    Sep 15, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-X1QTnrEg0mNUYlkqTODQrhkVJsTC8C5uaZ7krBkQBnk.jar
    Sep 15, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-hZTmKWsXx5tB67HRGtL4BDXarl9YtDxqnC5ikB3jfQM.jar
    Sep 15, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-_QXR8vKW6-O0XtjsVA8BhhLAmqFSaFSkB297klVqE8g.jar
    Sep 15, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-5dt0cx263kfYbgIbuNU1PRVsvqkmHdoSririLNqhpBg.jar
    Sep 15, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-p65iuUGrfPU4q-028-J5LNe2ex_txNAqQQXaSJJU6nk.jar
    Sep 15, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-C0fZc0Xv4dAlnCjlhxThy24pVy_sXgmFRJwL1sArmF8.jar
    Sep 15, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-ix6rcsjMkQvqH7LCZKvvbZj8pF_ZhjMiacBVPVuul9g.jar
    Sep 15, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-r8Bn7ZfkpOA6wNKqcSTPtcJlhwtLIcVYkc57kM97m40.jar
    Sep 15, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-M_pjHDaGTnXHv7dRqI7Hol0p7dD7nqlGGe4tYOiqH5s.jar
    Sep 15, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-F_4EQ1ti7iZ-E4Njo9_S6S3XB59gJvAgPZETTeXfNJo.jar
    Sep 15, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 1 seconds
    Sep 15, 2020 12:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 15, 2020 12:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 15, 2020 12:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 15, 2020 12:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 15, 2020 12:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 15, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95445 bytes, hash 3414cf9cb77a590bcdbab65cd6228dcf31337cbad2f07d46d4118a7a4ab23f58> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-NBTPnLd6WQvNurZc1iKNzzEzfLrS8H1G1BGKekqyP1g.pb
    Sep 15, 2020 12:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 15, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-15_05_45_20-16360112088404731813?project=apache-beam-testing
    Sep 15, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-15_05_45_20-16360112088404731813
    Sep 15, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-15_05_45_20-16360112088404731813
    Sep 15, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-15T12:45:20.883Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 15, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T12:45:29.780Z: Worker configuration: n1-standard-1 in us-central1-a.
    Sep 15, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T12:45:31.162Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 15, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T12:45:31.249Z: Expanding GroupByKey operations into optimizable parts.
    Sep 15, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T12:45:31.288Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 15, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T12:45:31.384Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 15, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T12:45:31.427Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 15, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T12:45:31.467Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 15, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T12:45:31.506Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 15, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T12:45:32.099Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 15, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T12:45:32.230Z: Starting 5 workers in us-central1-a...
    Sep 15, 2020 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-15T12:45:36.618Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 15, 2020 12:46:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T12:45:58.801Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 15, 2020 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T12:46:18.674Z: Workers have started successfully.
    Sep 15, 2020 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T12:46:18.736Z: Workers have started successfully.
    Sep 15, 2020 12:47:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T12:46:55.652Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 15, 2020 12:47:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T12:46:55.864Z: Cleaning up.
    Sep 15, 2020 12:47:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T12:46:55.969Z: Stopping worker pool...
    Sep 15, 2020 12:47:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T12:47:47.376Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 15, 2020 12:47:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T12:47:47.436Z: Worker pool stopped.
    Sep 15, 2020 12:47:56 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-15_05_45_20-16360112088404731813 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 47795574-24bd-447a-8a7a-5df917ea7a7b and timestamp: 2020-09-15T12:47:56.340000000Z:
                     Metric:                    Value:
                   read_time                    17.699
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 15, 2020 12:47:56 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.019 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 2 mins 49.653 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 39s
107 actionable tasks: 71 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/ppqho36if5enq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #997

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/997/display/redirect?page=changes>

Changes:

[noreply] [BEAM-10781] Add


------------------------------------------
[...truncated 277.42 KB...]
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 15, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 15, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 15, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 15, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 15, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 15, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 15, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 15, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 15, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 15, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 15, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 15, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 15, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 15, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 15, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 15, 2020 6:45:14 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 15, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 15, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-yWdefyEFq-9-2lhkV8k_vvEYer6Zb0TTxQ_RyQPVXVo.jar
    Sep 15, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-WBWTMtOFu75X3HZ2ejxJ-Akn2dimPy1v31QZkoXAk14.jar
    Sep 15, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-RgaDLLR1btZFZhcFI5b4pgpV_6qPLbsSvUm2or7N2E0.jar
    Sep 15, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-AjntAQc45I49bOMxBY0QGXz5hXpdbOb0QNAaghiL9jU.jar
    Sep 15, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-ye8uB7tMxqbt16kmtZDwdtTsi69iOXbOegZ1WsNQRdI.jar
    Sep 15, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-q5bIb2bTTyEmCmVcRhGvKe4SabOGXdU721UnRmfLJSM.jar
    Sep 15, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-yWdefyEFq-9-2lhkV8k_vvEYer6Zb0TTxQ_RyQPVXVo.jar
    Sep 15, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-xCU3VkU2o5HA5IR40MPV5qh7cYGxWFcC51ZZ39XZd5E.jar
    Sep 15, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-vbh2njJAv2JayAJMBmPFFMqFORaDHtYSEk3Me2byHBo.jar
    Sep 15, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-wVG24QdiMfKOxIyLmq38iCuaQipf9TzCjlDWV-swjkA.jar
    Sep 15, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-LK0mt87eDaDI9xk2f7Zfa8g3faao_AQgbaPj8zjWnfs.jar
    Sep 15, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-M1Xlgmj4fYaDw4vf5eW8TvvQXedy9vRJdhMSgk_HK-I.jar
    Sep 15, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-wF1he_YGzIdXcOn5wnNusVMIOyKr77MO9QE3mkxqYXA.jar
    Sep 15, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-EA6Bx2Wm9qHzYtvFyP0FO0l6WKJMwlhw18dD-KJGrio.jar
    Sep 15, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT--_dhmW25gaZ-sn9N3eyAWmRYF_uWp_iu5a0va4ke3_M.jar
    Sep 15, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-uFh5njE1uvatDRZAAOEUUyoogVW6YeajZay1fmt5ffk.jar
    Sep 15, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test938548984134394369.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-njmWkOnvRTbGMWsv_riMw10P9kC5Kcp0CCNSjTQldVc.jar
    Sep 15, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-zU_Nte5CmvaGnyK3EOj9-Ev5-boPwTGQODKpzcHkiAM.jar
    Sep 15, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-vAIh5uhem8tlkKUNAurxGaW--8SBJB1NqHZ1RETeANE.jar
    Sep 15, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-3eyP4uvZXem_quC_eHGscqiSGEE49FZ_SAl1-ozulDA.jar
    Sep 15, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-lH2lK4b7cIKtYY4li-Mh99XRN_VtpQtjKBBdMkF7diU.jar
    Sep 15, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-Z7FPeVS1hrLopYsjuCkvP26ft0TeCDP2_eRoEv-C1UA.jar
    Sep 15, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-VJUYsmhTCaQHyLW6GhmoqakNN7CBTrGQB-TMmxOEu5U.jar
    Sep 15, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-CbDekhj_fiF8MiKqFme7oSq3sHrs5DvRd6RFduYgW6Q.jar
    Sep 15, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-DWOwOxlCLA8uVOImAgo31751fUAn74F-vqrrW3eV45s.jar
    Sep 15, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-qCuE4txoHbgy3jys1QTh2gYIm7Gu66I18w3iHDAq8iM.jar
    Sep 15, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-GK9DLiMOvkLmalYkImnFoHPx_JsD45-rgSIAXq314FQ.jar
    Sep 15, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-s3utP4S3dlHwqdgTxAXfkmy_L_0TYGaKiOBhs6VsN_Y.jar
    Sep 15, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 192 files cached, 27 files newly uploaded in 0 seconds
    Sep 15, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 15, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 15, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 15, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 15, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 15, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95444 bytes, hash 181f6574771acb7090353458b9ca45d17d37ce3e6816dff8f9710055d09866b2> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-GB9ldHcay3CQNTRYucpF0X03zj5oFt_4-XEAVdCYZrI.pb
    Sep 15, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 15, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-14_23_45_18-8199746134253764270?project=apache-beam-testing
    Sep 15, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-14_23_45_18-8199746134253764270
    Sep 15, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-14_23_45_18-8199746134253764270
    Sep 15, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-15T06:45:18.923Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 15, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T06:45:25.833Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 15, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T06:45:26.579Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 15, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T06:45:26.620Z: Expanding GroupByKey operations into optimizable parts.
    Sep 15, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T06:45:26.725Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 15, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T06:45:26.898Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 15, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T06:45:26.932Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 15, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T06:45:26.969Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 15, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T06:45:27.004Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 15, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T06:45:27.461Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 15, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T06:45:27.544Z: Starting 5 workers in us-central1-b...
    Sep 15, 2020 6:45:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T06:45:54.911Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 15, 2020 6:46:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-15T06:46:01.356Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 15, 2020 6:46:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T06:46:15.723Z: Workers have started successfully.
    Sep 15, 2020 6:46:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T06:46:15.754Z: Workers have started successfully.
    Sep 15, 2020 6:46:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T06:46:47.551Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 15, 2020 6:46:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T06:46:47.684Z: Cleaning up.
    Sep 15, 2020 6:46:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T06:46:47.757Z: Stopping worker pool...
    Sep 15, 2020 6:47:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T06:47:39.634Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 15, 2020 6:47:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T06:47:39.671Z: Worker pool stopped.
    Sep 15, 2020 6:47:47 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-14_23_45_18-8199746134253764270 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): ba675ed6-d009-4d58-8184-4c700504327f and timestamp: 2020-09-15T06:47:47.205000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    10.039

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 15, 2020 6:47:47 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.019 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker,5,main]) completed. Took 2 mins 42.744 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 30s
107 actionable tasks: 71 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/dvieocot2yniq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #996

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/996/display/redirect?page=changes>

Changes:

[Boyuan Zhang] [BEAM-10505][BEAM-10530] Add truncate capability.

[sychen] Add max buffering duration to GroupIntoBatches

[noreply] [BEAM-10252] Add null check on logical type override (#12831)

[noreply] Updates Dataflow containers used by unreleased SDKs. (#12833)

[noreply] [BEAM-9615] Add initial schema proto documentation. (#12553)


------------------------------------------
[...truncated 287.66 KB...]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 15, 2020 12:48:53 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 15, 2020 12:48:53 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 15, 2020 12:48:53 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 15, 2020 12:48:53 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 15, 2020 12:48:53 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 15, 2020 12:48:53 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 15, 2020 12:48:54 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
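
The readUsingDefaultMethod failure above is a coder-inference problem: the RowMonitor ParDo emits Beam Rows, and a PCollection of Row cannot get a default Coder unless a schema is attached. The exception text itself names the two remedies (setCoder or setRowSchema). A minimal, self-contained sketch of the setRowSchema route follows; the schema fields, values, and class name are illustrative assumptions and are not taken from BigQueryIOPushDownIT.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      // Hypothetical schema mirroring the four selected columns in the query above.
      static final Schema SCHEMA =
          Schema.builder()
              .addStringField("author")
              .addStringField("type")
              .addStringField("title")
              .addInt32Field("score")
              .build();

      public static void main(String[] args) {
        Pipeline p = Pipeline.create();
        PCollection<Row> rows =
            p.apply(Create.of("seed"))
                .apply(
                    ParDo.of(
                        new DoFn<String, Row>() {
                          @ProcessElement
                          public void process(ProcessContext c) {
                            c.output(
                                Row.withSchema(SCHEMA)
                                    .addValues("someone", "story", "a title", 3)
                                    .build());
                          }
                        }));
        // Without this call, coder inference fails with the same
        // "Unable to return a default Coder" IllegalStateException as above.
        rows.setRowSchema(SCHEMA);
        p.run().waitUntilFinish();
      }
    }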

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 15, 2020 12:48:54 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 15, 2020 12:48:54 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 15, 2020 12:48:54 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 15, 2020 12:48:54 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 15, 2020 12:48:54 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 15, 2020 12:48:54 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 15, 2020 12:48:55 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 15, 2020 12:48:55 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
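
The two INFO lines above are the push-down itself: the projection (usedFields) and the supported filter are handed to the BigQuery Storage read, so only the four selected columns and the matching rows ever leave BigQuery. For comparison, a rough equivalent written directly against BigQueryIO is sketched below; the table reference and options wiring are placeholders rather than the test's actual configuration.

    import java.util.Arrays;
    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
    import org.apache.beam.sdk.values.PCollection;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();
        PCollection<TableRow> rows =
            p.apply(
                BigQueryIO.readTableRows()
                    .from("my-project:my_dataset.HACKER_NEWS") // placeholder table
                    .withMethod(Method.DIRECT_READ)
                    // Column projection: only these fields are transferred.
                    .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                    // Row restriction: evaluated server-side by the Storage API.
                    .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));
        p.run().waitUntilFinish();
      }
    }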
    Sep 15, 2020 12:48:57 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 15, 2020 12:49:03 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 15, 2020 12:49:04 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-bpnVUet1L2gi8y3_Ydmhi_7xMzqW74CuDhFJEjpccgc.jar
    Sep 15, 2020 12:49:04 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-TdwaPIXn7SwH5RqnYQC6wXo9pJcw2NrHHRVH17RmOH8.jar
    Sep 15, 2020 12:49:04 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-6cTrVIpU9Wq-CtHTPJC1E5XZePvzVWCriUzYoy4PJk8.jar
    Sep 15, 2020 12:49:05 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-GBZEkGS1JZoH4PsRV4WiM343J3CMStXhqa_Rcr8sGVU.jar
    Sep 15, 2020 12:49:05 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-fRrEJOk1YO5QV2MXdRBlJ1LhlwxEJFbLDMeK77vcOOM.jar
    Sep 15, 2020 12:49:05 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-iAnCK8KCE204TLvrv7KfEd-9ZlTF70gH-ia2dOzeRbg.jar
    Sep 15, 2020 12:49:05 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-PAJtHnKALAjmgkVWDuDFrcA4i_uKuf-ZAnVP6EexOOk.jar
    Sep 15, 2020 12:49:05 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test614034748031506813.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-DvynTGUUy4AxAeI2Klx07IAJGHdbslHqjsOQeQ8kL-M.jar
    Sep 15, 2020 12:49:05 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-ddyTG-hsKIapOx6YESByMd0e1WaWHf8GFZr08KqG7HY.jar
    Sep 15, 2020 12:49:05 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-CHlu8ux0XcJz6rUkNQMclTP1uCBZyxw3jSZUSIVgQY8.jar
    Sep 15, 2020 12:49:05 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-5oDDjKrRd963w2-ONnGSPkq9HCWsxMvwlvCHwU2QtHA.jar
    Sep 15, 2020 12:49:05 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-h13F9d2yuRweXOfz4fsu18Flaq42ELztGFiWYM6o6IY.jar
    Sep 15, 2020 12:49:05 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-KJy7QiSYQ6vr2mj_FdJKKRQWOyWBimrJHm6_XI2YJPs.jar
    Sep 15, 2020 12:49:05 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-bpnVUet1L2gi8y3_Ydmhi_7xMzqW74CuDhFJEjpccgc.jar
    Sep 15, 2020 12:49:05 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-bi2BMEIj7PFXRJjIcKg4LZrDdEt_-l1d22oyCPGHjzc.jar
    Sep 15, 2020 12:49:05 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-2Z-UCSTeqUF94QQQWu2afu8ywoOv2FijecPtnX-LNkw.jar
    Sep 15, 2020 12:49:05 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-cIxsUhg_hpbWGuTX5Wn_i9olLHLDlMCRG5lsT2uXboo.jar
    Sep 15, 2020 12:49:05 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-I1VK8wyM4pZCTHxcqKjlfIP1_d-DTF8nS54C8HVrqCY.jar
    Sep 15, 2020 12:49:05 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-Vvi0Xci_EyQiBsuuvLduezBHmS2tQhvR7ARJMVO4SpY.jar
    Sep 15, 2020 12:49:05 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-X_8M_xUvEumode6m6gLbERaMxghvlQ6W40549byZ9zY.jar
    Sep 15, 2020 12:49:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-PFsr9f1-wh8OsCbIn65Yn2I4rJIPbj6-d6IoXMV2UKA.jar
    Sep 15, 2020 12:49:07 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-Dcg9MEyeaPYGYEu8ir_ao3mMj7t07PFITGH0YBQnlVM.jar
    Sep 15, 2020 12:49:08 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-0KPLn4w3noOcD03rNEVNkrAjN5DisZ9CShdbtQh9KmI.jar
    Sep 15, 2020 12:49:08 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-BE2NH_nwq6CaQpVp4wOj_Ig3_Li4SLgPVf2hbi9xklk.jar
    Sep 15, 2020 12:49:09 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-etoX5NfcbgP1WCyHbjnGGxa_66K1_XnUQ1c1zzx8Qxw.jar
    Sep 15, 2020 12:49:10 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-0sh0Ps_XRGaFSZgTcKnY6_VDsh1FGGvQnzwauqHwWTM.jar
    Sep 15, 2020 12:49:10 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-q5oL7thRYgSraII0xpTwlPhdZKMllozIHQo_3K34zto.jar
    Sep 15, 2020 12:49:10 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests--JXO7zLY3l_CT_T44L69NcmWNdZkojgGug21TU6YT9M.jar
    Sep 15, 2020 12:49:10 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-PytO0hTU07CIfKpcEJKkdjyw8ZXDRVsoGaFe8kL8mqA.jar
    Sep 15, 2020 12:49:10 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-nl5a7ZCZXdy4dOeXceoz-qTguAVji7AoVqLeLrztZ-Y.jar
    Sep 15, 2020 12:49:11 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-59O6PWlce2SSOR0AJPHCcH_wrY_u9_FAs-NSGfZk57c.jar
    Sep 15, 2020 12:49:11 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 7 seconds
    Sep 15, 2020 12:49:11 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 15, 2020 12:49:11 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 15, 2020 12:49:11 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 15, 2020 12:49:12 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 15, 2020 12:49:12 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 15, 2020 12:49:12 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95444 bytes, hash 8acc55e6887ade0403ec5aa066b4160f2872a3fd87b85319cfbecba32bbe5448> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-isxV5oh63gQD7FqgZrQWDyhyo_2HuFMZz77Loyu-VEg.pb
    Sep 15, 2020 12:49:12 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 15, 2020 12:49:14 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-14_17_49_12-11314874561782367437?project=apache-beam-testing
    Sep 15, 2020 12:49:14 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-14_17_49_12-11314874561782367437
    Sep 15, 2020 12:49:14 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-14_17_49_12-11314874561782367437
    Sep 15, 2020 12:49:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-15T00:49:12.938Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 15, 2020 12:49:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T00:49:22.937Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 15, 2020 12:49:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T00:49:25.109Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 15, 2020 12:49:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T00:49:25.163Z: Expanding GroupByKey operations into optimizable parts.
    Sep 15, 2020 12:49:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T00:49:25.191Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 15, 2020 12:49:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T00:49:25.265Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 15, 2020 12:49:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T00:49:25.294Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 15, 2020 12:49:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T00:49:25.326Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 15, 2020 12:49:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T00:49:25.363Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 15, 2020 12:49:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T00:49:25.732Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 15, 2020 12:49:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T00:49:25.798Z: Starting 5 workers in us-central1-b...
    Sep 15, 2020 12:49:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-15T00:49:38.701Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 15, 2020 12:49:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T00:49:54.326Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Sep 15, 2020 12:49:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T00:49:54.403Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Sep 15, 2020 12:50:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T00:49:59.853Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 15, 2020 12:50:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T00:50:17.008Z: Workers have started successfully.
    Sep 15, 2020 12:50:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T00:50:17.042Z: Workers have started successfully.
    Sep 15, 2020 12:50:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T00:50:50.312Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 15, 2020 12:50:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T00:50:50.458Z: Cleaning up.
    Sep 15, 2020 12:50:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T00:50:50.552Z: Stopping worker pool...
    Sep 15, 2020 12:51:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T00:51:40.809Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 15, 2020 12:51:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T00:51:40.865Z: Worker pool stopped.
    Sep 15, 2020 12:51:49 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-14_17_49_12-11314874561782367437 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 25c1ffd3-4970-4234-98c9-34f63866e5bc and timestamp: 2020-09-15T00:51:49.186000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    16.742

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 15, 2020 12:51:49 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.058 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.081 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 3 mins 24.535 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 5s
107 actionable tasks: 74 executed, 33 from cache

Publishing build scan...
https://gradle.com/s/oydc4ulyp2tci

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #995

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/995/display/redirect?page=changes>

Changes:

[qinyeli] Removing dead code from Interactive Beam.

[pulasthi911] adding twister2 quickstart

[pulasthi911] adding twister2 documentation

[noreply] [BEAM-10463] Fix minor typos

[nosacky] [BEAM-10886] Also publish build scans from Github actions.


------------------------------------------
[...truncated 279.46 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 14, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 14, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 14, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 14, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 14, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 14, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 14, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 14, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 14, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 14, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 14, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 14, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 14, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 14, 2020 6:45:19 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 14, 2020 6:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 14, 2020 6:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 14, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 14, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-Gff2SfaJfWyqI4FCc1aunl2cTpu4PI1VslO1VrjsFws.jar
    Sep 14, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-fd3evTY5oifDUVeSaOlV4BL8IYP0uu5taEDpNNKdB3A.jar
    Sep 14, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-9E_vyQI09fTU0d29wUoONDyKGrbEiXoavf7aRx9zF5k.jar
    Sep 14, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-K0wqBGcyY46bBSva-wHL5iQjYCHBgDKqIJ9qTaJhtQo.jar
    Sep 14, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-EZKV-byY2feyfSg_4yCb6xYRuyGENWo1kYFXMLH8VSg.jar
    Sep 14, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4445160172037073909.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-ir-EVZavJqUEqsTD02seIpBtZ0x3AYYu5LFYCppx9l4.jar
    Sep 14, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-RjHmqByzOKXc9djNjBhIgxGI24TvDqf43PQIEU4eJuk.jar
    Sep 14, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-aj7MKwWJU-IASe47AxUgfjPGv0PvpzfmrV51kzvZKUo.jar
    Sep 14, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-jWnsEvZzhpFlZ6LgTlmyWLrgvgKot1TnxWvHMkLjlK8.jar
    Sep 14, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT--3KFBXiX-JsrokjCauimoB893cO4_UdKP8hoPKiWXOc.jar
    Sep 14, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-5c2oB6MdnRcr-IRjs3UOiTDiQmBnXrfq1XOwthI6aHM.jar
    Sep 14, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-tvekpeFJPJy4ScFY99Qh5r-qMgJGmSSv0ayKaBmGG88.jar
    Sep 14, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-W42A8nyb5YbfGGL-JjU0pNWtAqdOa0y83udcQhRrqMo.jar
    Sep 14, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-aq7LqsFmHj851jLYZw46oUsSYVEvuU_0-HZVP_TzqSg.jar
    Sep 14, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-rZNdxYu4Su1qUEJQ9r0b40ptejo5OURgW2T_2rBifm4.jar
    Sep 14, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-Gff2SfaJfWyqI4FCc1aunl2cTpu4PI1VslO1VrjsFws.jar
    Sep 14, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-FATCX34Tdd10TlSsHH2DIfvVsOpPV2DKQLko7TXDLEE.jar
    Sep 14, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-j2fHYhOWK1YrtrTEmlYX3McZaQxxWCigb4XaIrEgp3g.jar
    Sep 14, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-kdj-1twrmLgUSU8X1DoGne6oTLs9alRnE31wPZsLp8Y.jar
    Sep 14, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-xBcW7Kya05FCf7WAWNOZUhURhiFKccpXTiALrT0OgaE.jar
    Sep 14, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-JxLiB7qB4WgL3pO5RyC-_5BTo88b00RwyKQi3HLgSjE.jar
    Sep 14, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-pv-G1BjhO8IV5fNm962uYRRM3SZmaDacJC5qyOcbHMY.jar
    Sep 14, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-zXYWJHvJW2ecotWWHqLPrWGtQ0J5jrlea2qyeI3_Pxs.jar
    Sep 14, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-rJLQ5NQB1HuBg-4Qzqm2uAn_mCUlmXCAsuYCoz40cjc.jar
    Sep 14, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests--FXOqtJfto7tC0TKRBdVq7CCZGh7qg_zfj8RgrOkQY4.jar
    Sep 14, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-AqQNdsIqC96STfaD_nEuUU5VkcqEE3xDZaxLofbEBJ8.jar
    Sep 14, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-DDH5wVsG9_UryBaH5Yew1GkymzfRqUe2NK0FML-PLOo.jar
    Sep 14, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-HoX1xb0jPRBW9Ns2pGsAgrgPxDJ1L1bXhce2p3J2mjY.jar
    Sep 14, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-Ryb25v2NccvnnjeKC6AxSQXiijU8y5D86XwkRVBDPdE.jar
    Sep 14, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-T_KQdeASX1LVkikktpO-LJ9IdxWy0U2YVBaaFUsNcoU.jar
    Sep 14, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-TwxNPXNqHPDjESCHAhKiIs5N6w9AumHGLQJ79nEN-W8.jar
    Sep 14, 2020 6:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 1 seconds
    Sep 14, 2020 6:45:23 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 14, 2020 6:45:23 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 14, 2020 6:45:23 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 14, 2020 6:45:23 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 14, 2020 6:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 14, 2020 6:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95394 bytes, hash f9fb95a53c419314816e297f8d763209f4177e19750e7d37e0b25a59611d9bcd> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline--fuVpTxBkxSBbil_jXYyCfQXfhl1Dn034LJaWWEdm80.pb
    Sep 14, 2020 6:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 14, 2020 6:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-14_11_45_23-16211709926831709665?project=apache-beam-testing
    Sep 14, 2020 6:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-14_11_45_23-16211709926831709665
    Sep 14, 2020 6:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-14_11_45_23-16211709926831709665
    Sep 14, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-14T18:45:23.888Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 14, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T18:45:32.281Z: Worker configuration: n1-standard-1 in us-central1-a.
    Sep 14, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T18:45:33.263Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 14, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T18:45:33.316Z: Expanding GroupByKey operations into optimizable parts.
    Sep 14, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T18:45:33.356Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 14, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T18:45:33.425Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 14, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T18:45:33.453Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 14, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T18:45:33.483Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 14, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T18:45:33.520Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 14, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T18:45:33.952Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 14, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T18:45:34.018Z: Starting 5 workers in us-central1-a...
    Sep 14, 2020 6:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-14T18:45:48.658Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
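
The warning above links to the raw Monitoring API methods for cleaning up old descriptors; the same cleanup can be scripted with the google-cloud-monitoring Java client. The sketch below is illustrative only: the client calls are standard, but the type-prefix filter is a placeholder, not a value confirmed by this log, and nothing should be deleted without checking which metrics are actually unused.

    import com.google.api.MetricDescriptor;
    import com.google.cloud.monitoring.v3.MetricServiceClient;
    import com.google.monitoring.v3.ProjectName;

    public class DeleteStaleMetricDescriptors {
      public static void main(String[] args) throws Exception {
        String project = "apache-beam-testing"; // project id taken from the log above
        try (MetricServiceClient client = MetricServiceClient.create()) {
          for (MetricDescriptor d :
              client.listMetricDescriptors(ProjectName.of(project)).iterateAll()) {
            // Placeholder filter: adjust the prefix to match whichever custom metrics are unused.
            if (d.getType().startsWith("custom.googleapis.com/")) {
              // Delete by full resource name: projects/<project>/metricDescriptors/<type>
              client.deleteMetricDescriptor(d.getName());
            }
          }
        }
      }
    }
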
    Sep 14, 2020 6:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T18:46:09.919Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 14, 2020 6:46:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T18:46:17.741Z: Workers have started successfully.
    Sep 14, 2020 6:46:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T18:46:17.773Z: Workers have started successfully.
    Sep 14, 2020 6:46:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T18:46:57.387Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 14, 2020 6:46:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T18:46:57.555Z: Cleaning up.
    Sep 14, 2020 6:46:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T18:46:57.648Z: Stopping worker pool...
    Sep 14, 2020 6:48:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T18:48:05.733Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 14, 2020 6:48:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T18:48:05.777Z: Worker pool stopped.
    Sep 14, 2020 6:48:14 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-14_11_45_23-16211709926831709665 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 34ff9828-36d8-4bbe-b18e-ab6130313b83 and timestamp: 2020-09-14T18:48:14.218000000Z:
                     Metric:                    Value:
                   read_time                    21.794
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 14, 2020 6:48:14 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.022 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 3 mins 5.783 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
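
To reproduce a failure like this locally with more diagnostics, the failing task and test class named above can be re-run directly; a minimal sketch (the pipeline-option system properties that the Jenkins job normally passes are omitted here and would still be required):

    ./gradlew :sdks:java:extensions:sql:perf-tests:integrationTest \
        --tests "org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT" \
        --stacktrace --info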

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 58s
107 actionable tasks: 71 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/bcvccvp6tbbiy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #994

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/994/display/redirect>

Changes:


------------------------------------------
[...truncated 281.23 KB...]
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 14, 2020 12:45:31 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 14, 2020 12:45:31 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 14, 2020 12:45:31 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 14, 2020 12:45:31 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 14, 2020 12:45:31 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 14, 2020 12:45:31 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 14, 2020 12:45:32 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
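
The IllegalStateException above names the remedy itself: a PCollection of Beam Rows cannot have a Coder inferred, so it needs an explicit schema via PCollection.setRowSchema. The sketch below is not the actual change to BigQueryIOPushDownIT; it is a minimal, self-contained illustration of that call, with a made-up schema and an inline DoFn standing in for the test's RowMonitor transform.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class SetRowSchemaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Hypothetical schema mirroring the projected columns of the query in the log above.
        Schema schema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();

        PCollection<Row> monitored =
            p.apply(Create.of(Row.withSchema(schema).addValues("a", "story", "t", 3L).build())
                    .withRowSchema(schema))
                .apply(
                    "RowMonitor", // stand-in for the test's RowMonitor ParDo
                    ParDo.of(
                        new DoFn<Row, Row>() {
                          @ProcessElement
                          public void process(@Element Row row, OutputReceiver<Row> out) {
                            out.output(row);
                          }
                        }))
                // Without this call, coder inference fails exactly as in the stack trace above.
                .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }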

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 14, 2020 12:45:32 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 14, 2020 12:45:32 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 14, 2020 12:45:32 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 14, 2020 12:45:32 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 14, 2020 12:45:32 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 14, 2020 12:45:32 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 14, 2020 12:45:32 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 14, 2020 12:45:32 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
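
For reference, the pushed-down read logged above is roughly what a hand-written BigQueryIO Storage API read would look like: the projected columns become withSelectedFields and the WHERE clause becomes withRowRestriction. This is an illustrative sketch only; the table reference is a placeholder, not the table the integration test actually reads.

    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        p.apply(
            "Read Input BQ Rows with push-down",
            BigQueryIO.readTableRows()
                .from("some-project:some_dataset.hacker_news") // placeholder table reference
                .withMethod(TypedRead.Method.DIRECT_READ)
                // Only the columns the projection needs are requested from the Storage API ...
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                // ... and the filter travels with the read as a row restriction.
                .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }
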
    Sep 14, 2020 12:45:33 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 14, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 14, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-CxpnhIkJSbOTB8vk8xJOsRwXSgpSu8W7LbIEK4toCTA.jar
    Sep 14, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-DzgoqJHqh4GW59AsaVySJtXehj2A6CjeVkFgrPQsL2g.jar
    Sep 14, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-39Wa-CWCsU2du0C5-vLVUT3rUKfZ-rlv21S9v3WO-m8.jar
    Sep 14, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-sLfGWJzdJEMnknSnoYzAlub7Nj3OTXXVCs08rBksy30.jar
    Sep 14, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-azRmsRR6AdTU3ho54SvOrInjRv_HSP7SmkYU1ZxCIZQ.jar
    Sep 14, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-DbD-CQ_ZZPY74RHvm3wOXAezEvjfBVAgFodU-sI6l_E.jar
    Sep 14, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-llJggq3rB-e2NSRVq6h7Ry7ToR8fs_Zjolb-8N_aADU.jar
    Sep 14, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-kuEBW3uBDIxv6KFmSbqje_2-FWKFUTM5fB5HA3myAOI.jar
    Sep 14, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-CxpnhIkJSbOTB8vk8xJOsRwXSgpSu8W7LbIEK4toCTA.jar
    Sep 14, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-Uf6vrC7ncpZ_2q9qQyRP0tKYSYvxqr6sA6YG1jNB18I.jar
    Sep 14, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-Iednw9vCH3TnIoYtJkUJ1Uo8fWUddjydJsr_MIs_3AM.jar
    Sep 14, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-h0pvrK-Aw8orC5WlxViOnR-xpLZcGsNERCveOOBY9K8.jar
    Sep 14, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8990540214586028926.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-LzAsT6I3jMDFJo-s9GwEgpsDtJRUOd-FIaa2x9mchgA.jar
    Sep 14, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT--o18nBfiOehUHVHwcS-tGWiSus-xDKR-MFwnYBxLmPY.jar
    Sep 14, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-vQXf0kFtaXlzvubwa6SKHpEsshOiOHe7C613SCt_BTM.jar
    Sep 14, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-G63_nmdO8Dm5No1TmjHwki-0-dsiUvhG2YE2GV1mUVM.jar
    Sep 14, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-FtNOZmGDuCtTMlw1TAZ2gg05GyjWuHIMYltYQuxSREs.jar
    Sep 14, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-ndhjz73sDx0-_-63Dtg8BqIA9Sw1bCQWzeB2aJYmp-E.jar
    Sep 14, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-4DFWgJMaOAZz8QU7aMS9I0eKrWB8WNtOYsKfCWRCLs0.jar
    Sep 14, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-u1y-L1iWZ3Zf-lUg7zx2HOEDizUPBWmOqg6p0T4gdP0.jar
    Sep 14, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-qUDJkWXrwbtn_nSfMipRq9ySWbuaNIVKCAwr53YuMyQ.jar
    Sep 14, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-xSBDtfxYCJ4vQhwnG12lbwluBmeT__2ZiOm32cCFmcY.jar
    Sep 14, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-vcBy8cQP-RpphCAK6y9Dti1btBPCegXw3v9DsYgbVKw.jar
    Sep 14, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-PuGMbzem6fWB_X7iCsEodbK1RqaTjqBGyPFjMsNWaYE.jar
    Sep 14, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-43n1xsw_9j1-J04CQt2JsMPs6qpqDUl6MZvbKsIimq8.jar
    Sep 14, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-2P3eAK4LiRpKuWR_XMbVhpYrlI7Vk7tkicwXhw_OOjU.jar
    Sep 14, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-bN6OY1lgBogbtXYcI0G3Xad_G_ndVm56msDMONKeEzg.jar
    Sep 14, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-lj8F-Vc2me6g6BR0iPYqDUEPSwMbmCXTmw50Y_1g64U.jar
    Sep 14, 2020 12:45:37 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-UpXRq82RJqjZCxQeGgShVLT4TOWVt4Dmk9b5db9KCos.jar
    Sep 14, 2020 12:45:37 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-Z75leKgHkjdS7V1WuKQuaUKIbxxF5Uqko3WEEkvyf-U.jar
    Sep 14, 2020 12:45:37 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-vyNJxwxjFRw7bBPlVpq440HMe6mbxMP4Rxm5ep7id9s.jar
    Sep 14, 2020 12:45:37 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 1 seconds
    Sep 14, 2020 12:45:37 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 14, 2020 12:45:37 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 14, 2020 12:45:37 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 14, 2020 12:45:37 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 14, 2020 12:45:37 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 14, 2020 12:45:37 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95394 bytes, hash 1bea30a6268920cbf9a870e547b547b77182c23db4665c12e049b8b02238bf1f> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-G-owpiaJIMv5qHDlR7VHt3GCwj20ZlwS4Em4sCI4vx8.pb
    Sep 14, 2020 12:45:38 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 14, 2020 12:45:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-14_05_45_38-2637743576305468908?project=apache-beam-testing
    Sep 14, 2020 12:45:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-14_05_45_38-2637743576305468908
    Sep 14, 2020 12:45:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-14_05_45_38-2637743576305468908
    Sep 14, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-14T12:45:38.131Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 14, 2020 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T12:45:44.637Z: Worker configuration: n1-standard-1 in us-central1-f.
    Sep 14, 2020 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T12:45:45.405Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 14, 2020 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T12:45:45.444Z: Expanding GroupByKey operations into optimizable parts.
    Sep 14, 2020 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T12:45:45.483Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 14, 2020 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T12:45:45.561Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 14, 2020 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T12:45:45.587Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 14, 2020 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T12:45:45.614Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 14, 2020 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T12:45:45.635Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 14, 2020 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T12:45:45.979Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 14, 2020 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T12:45:46.058Z: Starting 5 workers in us-central1-f...
    Sep 14, 2020 12:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-14T12:46:11.460Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 14, 2020 12:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T12:46:11.862Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Sep 14, 2020 12:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T12:46:11.889Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Sep 14, 2020 12:46:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T12:46:17.282Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 14, 2020 12:46:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T12:46:33.064Z: Workers have started successfully.
    Sep 14, 2020 12:46:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T12:46:33.102Z: Workers have started successfully.
    Sep 14, 2020 12:47:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T12:47:05.621Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 14, 2020 12:47:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T12:47:05.759Z: Cleaning up.
    Sep 14, 2020 12:47:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T12:47:05.847Z: Stopping worker pool...
    Sep 14, 2020 12:47:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T12:47:48.217Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 14, 2020 12:47:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T12:47:48.268Z: Worker pool stopped.
    Sep 14, 2020 12:47:55 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-14_05_45_38-2637743576305468908 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 18f7b8d1-72ce-4bdc-8ff3-7d1cdb109550 and timestamp: 2020-09-14T12:47:55.482000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    14.372

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 14, 2020 12:47:55 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 2 mins 34.229 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 36s
107 actionable tasks: 71 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/miln4ortzmdoc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #993

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/993/display/redirect>

Changes:


------------------------------------------
[...truncated 279.50 KB...]
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 14, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 14, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 14, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 14, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 14, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 14, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 14, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 14, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 14, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 14, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 14, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 14, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 14, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 14, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 14, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 14, 2020 6:45:13 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 14, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 14, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-2TiDJRJqpZftx7W4o3uXh3T7LNchuzA0v1XoGGUiM78.jar
    Sep 14, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-hXijY5fbGHUzIPuzCtrVwXrlmpRBcOgR1JhqXVeyCWA.jar
    Sep 14, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-nKbn_c_c0dx9jPxNJHBcKFCEPeklLSdupPTiPkEM0Uw.jar
    Sep 14, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7357824017053594814.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-U2NJ_70DbSif3JGEla02Ye6-Ua-lXoSugNUu2kl9NL8.jar
    Sep 14, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-P6yMXXBewN00uFAF86TK2WIVyv-NBkrxXAufpoug3Tg.jar
    Sep 14, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-nW-vJAdAuY5gwOb5m0A9jg4WVutC-6-bFlzVVFl7Owk.jar
    Sep 14, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-CYEKkTqfjLnJ5LJMkn6ixYqLijv0oQz-Guyl-8nTkfU.jar
    Sep 14, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-VZ1PTDAm8CZS3RO9q8ja-K4yhwf9mk-wPdGaj29pLb0.jar
    Sep 14, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-HcrTDY02lQ7pGWnUnZTy9qt28frBR1Evop9Fk-sbOzw.jar
    Sep 14, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-gCxbzM7UIWmBWbh8DoWZphzglpNQ3Q1y8YDTyxIHr_4.jar
    Sep 14, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-j96WeB2aX9urpVdK2rLhTVcLBoyKkDvOoTGiOea22WU.jar
    Sep 14, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-12VKpNDhgubLbwr5eoFB-WNco2kBEdC08MeVxr3YqBk.jar
    Sep 14, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-g6h7cMG_smhszM39KEZYSzC-kBJtiFweL4D6QLx03zw.jar
    Sep 14, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-2TiDJRJqpZftx7W4o3uXh3T7LNchuzA0v1XoGGUiM78.jar
    Sep 14, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-m4ErMyNjgI3f9eci8fv9NSOjqG0w5J7NV17rChHivEI.jar
    Sep 14, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-oVmfMM3kpJUAgek8fbD35StAMBQXARleDHMqtQj-mNg.jar
    Sep 14, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-DNUWShArr-N9yjTruHfqR0BTqxAz6qgIgVBtSjSLm8U.jar
    Sep 14, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-fTdaIsGPLtOdNpXs3quJ0UqREGBne_siRRO3SOIkvQw.jar
    Sep 14, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-0wKuAM8TPU3Ve29H9125xvS1VfW3Fw8JwjgRIjahlms.jar
    Sep 14, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-z450xTt3aAd-l0IWgpMT3Ok3-2EnF5zjxNC_4XX0WPA.jar
    Sep 14, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-fOVdCzzyP4KWV2L-rSKbNTP1vSg9DgaWp-jdvTm-_P4.jar
    Sep 14, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-ySpJiGnYJLP2CP4EfTEJiQJwQg7CjCXUzHG71Tcbqus.jar
    Sep 14, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-hDTtL8yVb2Gt6LlSxqNxZprawGVgGcjqryMHxIL-P0s.jar
    Sep 14, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-pnYMXou3b3FFmFvg3R25b4O0ZnKAkzih2JX5pRO6Duc.jar
    Sep 14, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-d5pOGQ7BbfbwUi4ZZFBCH7tqQcVdjjPRC0PuuQnUvXI.jar
    Sep 14, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-50YPx-z-rZ0L5zUu7RSNllS5pI426Ze7L1D9LfhihzI.jar
    Sep 14, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-oZiIbXq5H-sB1WXndDYE1nzysHMX5BjcK4PlNcM63mY.jar
    Sep 14, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-ZhKm-s6R21aEJMvHewBhGvCaLtMAp-2IM1saEl13sU4.jar
    Sep 14, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-DbvvZGMFwkwawilRMGeZbWuLTvx3Fu_rrC49gaXchtM.jar
    Sep 14, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-R2XJ2Ux5Y6pIndbyvR50HCIOXYhU8YE4IZWpi8ui_Xg.jar
    Sep 14, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-J5kMTDV1lR-RsQQZE8MyzO4g8wJkniMWP2cC8KZa25c.jar
    Sep 14, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/com.alibaba/fastjson/1.2.68/9e3d29f05bcfab1c15a1357ebf2dd513c1d42f49/fastjson-1.2.68.jar to gs://temp-storage-for-perf-tests/loadtests/staging/fastjson-1.2.68-cGrbCezeeBQfDPJGWh6b307ug_n5g8_BYqWhckhy_rs.jar
    Sep 14, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 188 files cached, 31 files newly uploaded in 1 seconds
    Sep 14, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 14, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 14, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 14, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 14, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 14, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95394 bytes, hash 364a3735579105dcfebf2d5b44e665a34aa0492a3e6ad78e08c1b7bfe805bf82> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Nko3NVeRBdz-vy1bROZlo0qgSSo-ateOCMG3v-gFv4I.pb
    Sep 14, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 14, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-13_23_45_17-11896394134789602595?project=apache-beam-testing
    Sep 14, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-13_23_45_17-11896394134789602595
    Sep 14, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-13_23_45_17-11896394134789602595
    Sep 14, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-14T06:45:17.132Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 14, 2020 6:45:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T06:45:24.884Z: Worker configuration: n1-standard-1 in us-central1-f.
    Sep 14, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T06:45:25.770Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 14, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T06:45:25.811Z: Expanding GroupByKey operations into optimizable parts.
    Sep 14, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T06:45:25.841Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 14, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T06:45:25.908Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 14, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T06:45:25.944Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 14, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T06:45:25.971Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 14, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T06:45:26.009Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 14, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T06:45:26.489Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 14, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T06:45:26.587Z: Starting 5 workers in us-central1-f...
    Sep 14, 2020 6:45:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-14T06:45:33.960Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 14, 2020 6:45:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T06:45:59.124Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 14, 2020 6:46:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T06:46:14.950Z: Workers have started successfully.
    Sep 14, 2020 6:46:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T06:46:14.990Z: Workers have started successfully.
    Sep 14, 2020 6:46:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T06:46:48.302Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 14, 2020 6:46:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T06:46:48.426Z: Cleaning up.
    Sep 14, 2020 6:46:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T06:46:48.501Z: Stopping worker pool...
    Sep 14, 2020 6:47:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T06:47:35.627Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 14, 2020 6:47:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T06:47:35.671Z: Worker pool stopped.
    Sep 14, 2020 6:47:43 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-13_23_45_17-11896394134789602595 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 1aa485cf-57b0-4163-8c16-e91f7f519daa and timestamp: 2020-09-14T06:47:43.831000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    13.548

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 14, 2020 6:47:44 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
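
The warning above only affects reporting: the read_time and fields_read values printed a few lines earlier were measured, but without a measurement and database configured the publisher skips the InfluxDB export. A rough illustration of that kind of "publish only if configured" guard is below; the property names are assumptions made for this sketch, not the publisher's actual configuration keys.

    public class MetricsPublishGuard {
      public static void main(String[] args) {
        // Assumed keys for illustration only; not Beam's actual configuration properties.
        String measurement = System.getProperty("influxMeasurement");
        String database = System.getProperty("influxDatabase");

        if (measurement == null || database == null) {
          // Mirrors the check reflected in the warning above.
          System.err.println("Missing property -- measurement/database. Metrics won't be published.");
          return;
        }
        System.out.println("Publishing metrics to '" + database + "' / '" + measurement + "'.");
      }
    }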

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.017 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.025 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 2 mins 40.545 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 27s
107 actionable tasks: 71 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/qjqtdtj2f7uww

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #992

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/992/display/redirect>

Changes:


------------------------------------------
[...truncated 279.38 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 14, 2020 12:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 14, 2020 12:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 14, 2020 12:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 14, 2020 12:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 14, 2020 12:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 14, 2020 12:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 14, 2020 12:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
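
For context on the failure above: a PCollection of Beam Row elements has no coder the registry can infer, so the transform that produces it has to attach either a schema or an explicit RowCoder, exactly as the message suggests. A minimal, self-contained sketch of that fix follows; the two-field schema, field names, and class name are hypothetical and stand in for the IT's real HACKER_NEWS schema.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaExample {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Hypothetical schema, for illustration only.
        final Schema schema =
            Schema.builder().addStringField("author").addInt64Field("score").build();

        PCollection<Row> rows =
            p.apply(Create.of("someone:3"))
                .apply("ToRow", ParDo.of(
                    new DoFn<String, Row>() {
                      @ProcessElement
                      public void process(@Element String in, OutputReceiver<Row> out) {
                        String[] parts = in.split(":");
                        out.output(Row.withSchema(schema)
                            .addValues(parts[0], Long.parseLong(parts[1])).build());
                      }
                    }))
                // Without this, construction fails with the coder error shown above.
                .setRowSchema(schema);
        // Alternative fix: rows.setCoder(RowCoder.of(schema));

        p.run().waitUntilFinish();
      }
    }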

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 14, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 14, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 14, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 14, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 14, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 14, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 14, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 14, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
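
The BEAMPlan and the "Pushing down the following filter" line above are the substance of the push-down: only the used fields (by, type, title, score) are requested from the BigQuery Storage API, and the filter the planner marked as supported is evaluated server-side instead of inside BeamCalcRel. Written directly against BigQueryIO rather than through Beam SQL, an equivalent read looks roughly like the sketch below; the table reference, step name, and class name are illustrative, not the IT's actual code.

    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        p.apply(
            "ReadWithPushDown",
            BigQueryIO.readTableRows()
                // Storage API read, matching "BigQuery method is set to: DIRECT_READ" above.
                .withMethod(TypedRead.Method.DIRECT_READ)
                // Projection push-down: only the columns the query touches.
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                // Predicate push-down: the filter logged as pushed down above.
                .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2")
                // Illustrative table reference assumed for this sketch.
                .from("bigquery-public-data:hacker_news.full"));

        p.run().waitUntilFinish();
      }
    }
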
    Sep 14, 2020 12:45:14 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 14, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 14, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-Z9abmQLtwvYbNOhNIXYBghW-wiBMzrvqUO6VlA0Ftys.jar
    Sep 14, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-mGghy3JUURVt7jCOc31uUzyK67zTJ9ejarngJqkoheE.jar
    Sep 14, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-ChlpxMrkAx94DNYEGa9mjVwMYCSScYiTfa0vH4Jeh74.jar
    Sep 14, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-6cMjb4xH8S7Y7c0m3dfjANR-_OEGKeP_5Zz5te_hZ0Q.jar
    Sep 14, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-6NDgkGbsZHEN_Mfaq0Az4Lf0HkrWfm-cCVdFFV5SxBs.jar
    Sep 14, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-Z9abmQLtwvYbNOhNIXYBghW-wiBMzrvqUO6VlA0Ftys.jar
    Sep 14, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-oEMuUZZ7sYEqd0cJCdHBjpk0Z5XgaVpN-jMp2z6zJ0E.jar
    Sep 14, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-zTbek3XabzrNRI88ki6k3uah5p8PAgK2RMybkQzYO4c.jar
    Sep 14, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-pl57gdIyXPnk9nyxKT1nQTmEYIQRDZQAte-3DrJ_UeQ.jar
    Sep 14, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-QuSWX2PtjrDwjZHTH0MnDvLvtP2syBwtQi7bdieJI6M.jar
    Sep 14, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-xbKAVgSdWCwZJ6o18e1ptg_8BfD2rk2ZN1OqKL7wMro.jar
    Sep 14, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-rGBqcJ80l9wxbIBHqNqbpQR8CJ0VqFCptYKRn41CR8w.jar
    Sep 14, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-PSiSUCj757Q3kN97593dEm6Whg2Ol-pnT6FoMzFgwl0.jar
    Sep 14, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-hOm3pTfVM2jh1mI0bSy65Y9iKl3dyTg9Tc7ipwg2Xmk.jar
    Sep 14, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3906649559536095733.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-P43mHstF0Vn3cOMxP4q8IPMNodbniT8zVHQfbu6cEOI.jar
    Sep 14, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-Ze6CBFpEjaqYvLJMhLlOQ7lJbdvxpo4Is8y1dF4IO5I.jar
    Sep 14, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-xDIuE4m5k8jS48CHFZETDBCD6Nf0MfwqoWmglXDwvNo.jar
    Sep 14, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-aOklLejTZAwn-uulKLnuuUw2Sy9qBf5ci3RyxU4T6fw.jar
    Sep 14, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-5XrKoqwKsI_FVhoZuQ_wAL9CRFinPXoRC6r-g03NxZM.jar
    Sep 14, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-9mUEZaAXKhwFHrMXEqd4-tLTFSP4s9YsBXpmAnxukz8.jar
    Sep 14, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-hCEV9o-NnwGsUOgkYOuQ6roWsEIfaw-bEAcGkPJLXl8.jar
    Sep 14, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-cazJrUePQToOjy9u6d633lx67brCGD6CKm3nzpohsOk.jar
    Sep 14, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-phpMU5TwkEbn6J0lmHY4pxhdq1rlmvc-BYE7d8vsDO4.jar
    Sep 14, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-L_16y7CEte27wAOUGfsHBSUzY1-6zO58nb5ygjiNTsE.jar
    Sep 14, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-rdVX59bABzMJvY4BtQ44w6P2el5qNUoNCgpCM9p_ftU.jar
    Sep 14, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-Hmc1BTyhXc8sO1W2ILm7Iap-RSnW9feRfAtUzJvbmTA.jar
    Sep 14, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-85k9i1FCp7eey_VhWK15oVX_lWtftzEBEQn1zQr8CPI.jar
    Sep 14, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-4Dg2bR9-Oz9F2IwHrTjzCpg6iETVfiPxkuaI_lV82nI.jar
    Sep 14, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-nYZSkNl_7oDeDt9LR318__Asx7I2d0B2ILh_wuJ4I88.jar
    Sep 14, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-8cC5rTF0vkVmGbwfccRV_LWLEuzP_Us7R5VvlXcwHkI.jar
    Sep 14, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-dTKXrzh50rMFiCJWZPSGa82rIj1EF_8rmpCWOT0yseo.jar
    Sep 14, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 1 seconds
    Sep 14, 2020 12:45:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 14, 2020 12:45:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 14, 2020 12:45:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 14, 2020 12:45:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 14, 2020 12:45:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 14, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95396 bytes, hash c727f1f8c66e1a5dd75af46828f28b374733ece84b418e0a9776a8939b947216> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-xyfx-MZuGl3XWvRoKPKLN0cz7OhLQY4Kl3aok5uUchY.pb
    Sep 14, 2020 12:45:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 14, 2020 12:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-13_17_45_17-16012402092642319127?project=apache-beam-testing
    Sep 14, 2020 12:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-13_17_45_17-16012402092642319127
    Sep 14, 2020 12:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-13_17_45_17-16012402092642319127
    Sep 14, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-14T00:45:17.703Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 14, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T00:45:26.616Z: Worker configuration: n1-standard-1 in us-central1-f.
    Sep 14, 2020 12:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T00:45:27.812Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 14, 2020 12:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T00:45:27.850Z: Expanding GroupByKey operations into optimizable parts.
    Sep 14, 2020 12:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T00:45:27.882Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 14, 2020 12:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T00:45:27.958Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 14, 2020 12:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T00:45:27.986Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 14, 2020 12:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T00:45:28.010Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 14, 2020 12:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T00:45:28.032Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 14, 2020 12:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T00:45:28.402Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 14, 2020 12:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T00:45:28.482Z: Starting 5 workers in us-central1-f...
    Sep 14, 2020 12:45:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T00:45:56.229Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 14, 2020 12:46:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-14T00:45:59.815Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 14, 2020 12:46:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T00:46:14.143Z: Workers have started successfully.
    Sep 14, 2020 12:46:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T00:46:14.180Z: Workers have started successfully.
    Sep 14, 2020 12:46:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T00:46:45.899Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 14, 2020 12:46:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T00:46:46.048Z: Cleaning up.
    Sep 14, 2020 12:46:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T00:46:46.123Z: Stopping worker pool...
    Sep 14, 2020 12:47:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T00:47:40.149Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 14, 2020 12:47:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T00:47:40.190Z: Worker pool stopped.
    Sep 14, 2020 12:47:48 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-13_17_45_17-16012402092642319127 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 74a5efb7-ef22-4325-a537-fad5d78d2688 and timestamp: 2020-09-14T00:47:48.145000000Z:
                     Metric:                    Value:
                   read_time                     14.11
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 14, 2020 12:47:48 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.031 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 2 mins 43.721 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 31s
107 actionable tasks: 71 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/ad6xihyqivdxc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #991

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/991/display/redirect>

Changes:


------------------------------------------
[...truncated 279.77 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 13, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 13, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 13, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 13, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 13, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 13, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 13, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 13, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 13, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 13, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 13, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 13, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 13, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 13, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 13, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 13, 2020 6:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 13, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 13, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT--RpzHxJNG633hQ6lRU8IMc__-srXDPTr6Rxi4jee9h8.jar
    Sep 13, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-PklnGxcoosB048fUFrg6Pbf7T7Zg_xxzCCuuvaWvwYg.jar
    Sep 13, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-BOzzy5YU46WU9h_aTmD60nE1tXaqhF4hWn7-3CsH0M4.jar
    Sep 13, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-CNV8Py6ecNrsZ2WUhbIUR_T-BFSAAyTH1BUqhP5qVL8.jar
    Sep 13, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-Eu_puBqV06gF6n2ASMn5wApMPmi_DRH-LWfiyuBJ_5c.jar
    Sep 13, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-Ab6Pz8AXfFndmiF3Wf2ZOZt-6_0CfkLJLplPKRfBf1w.jar
    Sep 13, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-aFagGdf-53qDJRsJ05-i3RPaqBb3d3kZegSkVLyT3lk.jar
    Sep 13, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-Sv5_gYVXrO5MmpwMDIZd1XpwjGWmX-uj-siJzsw74GI.jar
    Sep 13, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-dHghPk7Snrfq_KwyLs9rEeqe-rPundgfZ53e2_HV_hY.jar
    Sep 13, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-FcosGwIDcMKy0XNH5f-jreMUEIL60-A-W0Bdd_LnoA0.jar
    Sep 13, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-cXjTZH3YkaOuY3-N3LVQ81eLTLLoa-O6D5TXISqTGEw.jar
    Sep 13, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-cynQExj2tz6Uo2er7EKARxJdUBqOjyuVJUOrF7cuMig.jar
    Sep 13, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-32eFoBJF9kHop5CV8z-801haFO4kv9wNwwmyvLFVgfk.jar
    Sep 13, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-b7x6rwUslYBlnY37Lh5qFHdeJ_gq320FGlrUnemcVi8.jar
    Sep 13, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-JQNVPwDcYCxln-U3v-044_i-1eDquZRl4LI19gZY2EM.jar
    Sep 13, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test6913594441642094934.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-dDUwAcsRDYpGgw3d0lyNQt2hwea5zLQFCC9PEQlf1DE.jar
    Sep 13, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-5VkQ46ypHk9hNLTtPC2wjTzWt8_XmXBUDzPlV6_RGvw.jar
    Sep 13, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-su5ZiYSRQzHnXfVbGzPSeZ-QOXT9pj1Ic2h3yAsWYfc.jar
    Sep 13, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-lY3kDUWHr27CySwduHqc_PVzo8kq95JykciO6Qa5reo.jar
    Sep 13, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-_0krzO8cyf9EKOlA0Mmq3kQreYPa2GefoQKsD13-BVQ.jar
    Sep 13, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-HWK4KWxMIxFCn9eaCr7T3bfJM_k82njFVaKumPna4v4.jar
    Sep 13, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-CMd6Ksef87T_Q7XCIwZ-oEqQOaZFRL7eNcZyAhtn5eQ.jar
    Sep 13, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-ezEk5HRzOOmOavPfXYvA0nXgkAvO3-jKaA8V-nvHzEc.jar
    Sep 13, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-7q_bbIubCLRw3F5to4-fk-DhCxNqwYlqBss0tBFmnnc.jar
    Sep 13, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-u6qyT123MU1fb91tJ-dKhgka2dMzQqlw1JGRFrKrxH8.jar
    Sep 13, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-PiXEQwkXQL7xEDkaXVQXPmakGnwEDnXIF19HG371ptU.jar
    Sep 13, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-BN6axB0kPFfH8K8O8G4a0QgxMT7Bx4upohImPis10Qo.jar
    Sep 13, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT--RpzHxJNG633hQ6lRU8IMc__-srXDPTr6Rxi4jee9h8.jar
    Sep 13, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-Tzpv0nKw04enEscH6qF2Gx_76_GAjdgS2MUsWaNXNS4.jar
    Sep 13, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-uA1jtCw2-j-QC_u9KywFWq3da5I6ROmRhgxLJ6QsCdA.jar
    Sep 13, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-AZEDRqMjLGsJadTUox0-ZjmzMxyNcDMgVYyf_NUQkqs.jar
    Sep 13, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 1 seconds
    Sep 13, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 13, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 13, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 13, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 13, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 13, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95394 bytes, hash 665fdec223cbc7d44d8f9714488dbf3afa7e360047bc0ec5c3a710d3b68e5e7f> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Zl_ewiPLx9RNj5cUSI2_Ovp-NgBHvA7Fw6cQ07aOXn8.pb
    Sep 13, 2020 6:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 13, 2020 6:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-13_11_45_21-3384341224961832294?project=apache-beam-testing
    Sep 13, 2020 6:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-13_11_45_21-3384341224961832294
    Sep 13, 2020 6:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-13_11_45_21-3384341224961832294
    Sep 13, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-13T18:45:21.190Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
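
The warning above reflects option interaction rather than a failure: with autoscalingAlgorithm=NONE the Dataflow worker pool is sized by numWorkers, so maxNumWorkers has no effect. A minimal sketch of the two configurations, assuming the standard Dataflow runner option interfaces and not taken from this job's code:

    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class WorkerPoolSketch {
      public static void main(String[] args) {
        DataflowPipelineOptions options =
            PipelineOptionsFactory.fromArgs(args).as(DataflowPipelineOptions.class);

        // Fixed-size pool: autoscaling off, numWorkers decides the size and
        // maxNumWorkers is ignored -- which is what the warning reports.
        options.setAutoscalingAlgorithm(
            DataflowPipelineWorkerPoolOptions.AutoscalingAlgorithmType.NONE);
        options.setNumWorkers(5);

        // Autoscaled pool: let Dataflow scale on throughput, capped by maxNumWorkers.
        // options.setAutoscalingAlgorithm(
        //     DataflowPipelineWorkerPoolOptions.AutoscalingAlgorithmType.THROUGHPUT_BASED);
        // options.setMaxNumWorkers(5);
      }
    }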
    Sep 13, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T18:45:29.584Z: Worker configuration: n1-standard-1 in us-central1-a.
    Sep 13, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T18:45:30.202Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 13, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T18:45:30.266Z: Expanding GroupByKey operations into optimizable parts.
    Sep 13, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T18:45:30.300Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 13, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T18:45:30.378Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 13, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T18:45:30.403Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 13, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T18:45:30.438Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 13, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T18:45:30.480Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 13, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T18:45:30.938Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 13, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T18:45:31.018Z: Starting 5 workers in us-central1-a...
    Sep 13, 2020 6:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-13T18:45:39.745Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 13, 2020 6:46:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T18:46:25.387Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 13, 2020 6:46:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T18:46:44.646Z: Workers have started successfully.
    Sep 13, 2020 6:46:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T18:46:44.682Z: Workers have started successfully.
    Sep 13, 2020 6:47:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T18:47:23.129Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 13, 2020 6:47:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T18:47:23.436Z: Cleaning up.
    Sep 13, 2020 6:47:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T18:47:23.609Z: Stopping worker pool...
    Sep 13, 2020 6:48:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T18:48:16.121Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 13, 2020 6:48:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T18:48:16.191Z: Worker pool stopped.
    Sep 13, 2020 6:48:24 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-13_11_45_21-3384341224961832294 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): c31e2bfc-d2b3-42c8-a49f-5029be3e958a and timestamp: 2020-09-13T18:48:24.184000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    18.321

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 13, 2020 6:48:24 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 3 mins 17.509 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
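
To reproduce one of these failures locally with the extra diagnostics Gradle suggests, the failing task named above can be re-run from a Beam checkout (assuming the usual ./gradlew wrapper at the repository root), for example:

    ./gradlew :sdks:java:extensions:sql:perf-tests:integrationTest --stacktrace --scan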

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 7s
107 actionable tasks: 71 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/7ilcfhmp6ibyw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #990

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/990/display/redirect>

Changes:


------------------------------------------
[...truncated 279.60 KB...]
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 13, 2020 12:45:09 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 13, 2020 12:45:09 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 13, 2020 12:45:09 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 13, 2020 12:45:09 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 13, 2020 12:45:09 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 13, 2020 12:45:09 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 13, 2020 12:45:09 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 13, 2020 12:45:09 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 13, 2020 12:45:09 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 13, 2020 12:45:09 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 13, 2020 12:45:09 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 13, 2020 12:45:09 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 13, 2020 12:45:09 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 13, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 13, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 13, 2020 12:45:10 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 13, 2020 12:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 13, 2020 12:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-ul2meUH0IQCe2yUPARZ6hPcjIVGt2QKcM4xmXmDueEo.jar
    Sep 13, 2020 12:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-LC12r3Miotzk70lOxfN0D_fbyreq1hCPmYjAUyycJCA.jar
    Sep 13, 2020 12:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-H1Zn7zuNlETPuSOLxqbd4gYYUabEr_BTFI3jtArKaA4.jar
    Sep 13, 2020 12:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-GOC3k4S4HIVVvKfFTIYluNZiHwYHFQX2oX2WMoOktcA.jar
    Sep 13, 2020 12:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8758513969827789827.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-dZKuIgkJYSQslwOIbML-0jPQ7MZ-K1imX3wpxD-UrPE.jar
    Sep 13, 2020 12:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-YnzkouMhxevqpL8dV1KEUgVV-9LIJBWaUyz-w1x8y0E.jar
    Sep 13, 2020 12:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-QZkdnvL6pMPy3zZC_umkdrs3zL72idn6iFgx4hgM-Do.jar
    Sep 13, 2020 12:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-tmUh938FfAdy-aj9QOt1hd0msh6bN_M64Nu3WH4Kqe8.jar
    Sep 13, 2020 12:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-akLsiCKhG4wbqngMrLIqG0SZsUFEimLyaYRiD_g5TIc.jar
    Sep 13, 2020 12:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-rRnslF_lQc6J2u3Wg0IIyoUUH4eC9W_ECzkWe9VIe7M.jar
    Sep 13, 2020 12:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-pE7LH1z7fkg0mEchFNv7ADFu8aP8GGKK0veqIdV0ARE.jar
    Sep 13, 2020 12:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-6zwVuzt92iGA4LPCXj_imSjAu5Z0aT89BwsqXiK7YBY.jar
    Sep 13, 2020 12:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-MTV5SSbnsxpVH7VIDhv7qAyagOArS-sdlxMU4yQ9hxs.jar
    Sep 13, 2020 12:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-tABhqM6q3BUbrcutC3OPLVLjuPITAbN0IL5woLb0RvA.jar
    Sep 13, 2020 12:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-rE3CMaTjBzCrUI4kBi8XxjP91X04ajRn21VzkMKzfCk.jar
    Sep 13, 2020 12:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-gWIbaXmJfyJTXeiDIKZdR436w9haA0YUmDxmxCaXMk8.jar
    Sep 13, 2020 12:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-kApSY8Xf2qUr_mVEYOjp36W0SP1mDE3alt82yHLxQoE.jar
    Sep 13, 2020 12:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-qcfO1vXyxgrNIzVrOsfI-IytaFmnYQqHN6ELka10iog.jar
    Sep 13, 2020 12:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-DqdpSmhWXaOncNwcgEtYj2nKkwH9DK_3BW1HzaWpVn0.jar
    Sep 13, 2020 12:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-DhSx4ER-r1klHjNsKasC2L3CovV7jeP4MCsyPcVLeSg.jar
    Sep 13, 2020 12:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-8MYvL54LO6j90zyIMck7Jr5MkVG_iiMc48zOzoKRedY.jar
    Sep 13, 2020 12:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-8MYvL54LO6j90zyIMck7Jr5MkVG_iiMc48zOzoKRedY.jar
    Sep 13, 2020 12:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-q7Rf1zbCsh00synwlzjbAeuBCPiaqJF_CpfuzQJBQ-g.jar
    Sep 13, 2020 12:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-k4jYjY6VeLyq7-19rk0ALmbaydU7Wl86yWiJ2amnK24.jar
    Sep 13, 2020 12:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-hHJ6mSE-jlLXmgL4fjPKeF9qXDh_078PfqXRYRBF5I4.jar
    Sep 13, 2020 12:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-bC62r1Nvo-ZmE-1LgPWVhiTyIpb9VBmpnqsAz-y8TH4.jar
    Sep 13, 2020 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-BtrLVcbVw089KAbnUrAhl4DzziXzdDaWEGRx-BrpBqM.jar
    Sep 13, 2020 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-hiq_e1FchSEnK3CRRz688NBurhKAR8BRvL0VQPcQYYA.jar
    Sep 13, 2020 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-1H7AWv5ndMOSuSgMGaT7EKNfVasTDQHiwCm2hn-tkwQ.jar
    Sep 13, 2020 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-PwVaHEtLfvJlGtCh8EgvNLjFMVV2Dt0yREt8-r6CIE8.jar
    Sep 13, 2020 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-bHIy6tcmJ50eeeoBxI-11fIy6YsOXXeVS1dVx22-Bp4.jar
    Sep 13, 2020 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 1 seconds
    Sep 13, 2020 12:45:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 13, 2020 12:45:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 13, 2020 12:45:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 13, 2020 12:45:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 13, 2020 12:45:13 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 13, 2020 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95394 bytes, hash 6cfd4c81e3c87683f6fa8df17a4bdb9d96b2a88efb77df5842a2605e202b4743> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-bP1MgePIdoP2-o3xekvbnZayqI77d99YQqJgXiArR0M.pb
    Sep 13, 2020 12:45:14 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 13, 2020 12:45:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-13_05_45_14-6935812336081207557?project=apache-beam-testing
    Sep 13, 2020 12:45:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-13_05_45_14-6935812336081207557
    Sep 13, 2020 12:45:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-13_05_45_14-6935812336081207557
    Sep 13, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-13T12:45:14.327Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 13, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T12:45:21.309Z: Worker configuration: n1-standard-1 in us-central1-f.
    Sep 13, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T12:45:22.033Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 13, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T12:45:22.065Z: Expanding GroupByKey operations into optimizable parts.
    Sep 13, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T12:45:22.092Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 13, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T12:45:22.172Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 13, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T12:45:22.199Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 13, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T12:45:22.231Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 13, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T12:45:22.263Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 13, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T12:45:22.593Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 13, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T12:45:22.662Z: Starting 5 workers in us-central1-f...
    Sep 13, 2020 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-13T12:45:40.217Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 13, 2020 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T12:45:52.178Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Sep 13, 2020 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T12:45:52.204Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Sep 13, 2020 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T12:45:57.643Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 13, 2020 12:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T12:46:12.580Z: Workers have started successfully.
    Sep 13, 2020 12:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T12:46:12.613Z: Workers have started successfully.
    Sep 13, 2020 12:46:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T12:46:43.660Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 13, 2020 12:46:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T12:46:43.794Z: Cleaning up.
    Sep 13, 2020 12:46:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T12:46:43.877Z: Stopping worker pool...
    Sep 13, 2020 12:47:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T12:47:34.029Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 13, 2020 12:47:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T12:47:34.082Z: Worker pool stopped.
    Sep 13, 2020 12:47:43 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-13_05_45_14-6935812336081207557 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 527837b7-ad48-45b8-91b0-5136175abd7d and timestamp: 2020-09-13T12:47:43.151000000Z:
                     Metric:                    Value:
                   read_time                    12.906
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 13, 2020 12:47:43 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 7,5,main]) completed. Took 2 mins 41.731 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 27s
107 actionable tasks: 71 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/sushkpb4mhgdm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #989

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/989/display/redirect>

Changes:


------------------------------------------
[...truncated 279.13 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 13, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 13, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 13, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 13, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 13, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 13, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 13, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
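
The IllegalStateException above is the usual symptom of a PCollection of Beam Rows reaching coder inference without a schema attached. A minimal sketch of the fix the message points at (PCollection.setRowSchema / attaching the schema at construction time), using hypothetical field names that mirror the projected columns rather than the test's real schema:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaFixSketch {
      public static void main(String[] args) {
        // Hypothetical schema mirroring the projected columns (author, type, title, score).
        Schema schema = Schema.builder()
            .addNullableField("author", Schema.FieldType.STRING)
            .addNullableField("type", Schema.FieldType.STRING)
            .addNullableField("title", Schema.FieldType.STRING)
            .addNullableField("score", Schema.FieldType.INT64)
            .build();

        Pipeline p = Pipeline.create();
        PCollection<Row> rows = p.apply(
            Create.of(Row.withSchema(schema).addValues("alice", "story", "example", 3L).build())
                // Attaching the schema here lets a Row coder be derived.
                .withRowSchema(schema));
        // For a PCollection<Row> produced elsewhere, the equivalent call is
        // rows.setRowSchema(schema); without it, coder inference fails exactly as above.
        p.run().waitUntilFinish();
      }
    }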

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 13, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 13, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 13, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 13, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 13, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 13, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 13, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 13, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 13, 2020 6:45:14 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 13, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 13, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-zIV1ItT13P0dLS9ytEImapkMPpr9OUBPOqkpx7a2eP0.jar
    Sep 13, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-RW0X05wCqxOYqBX4HKP5HNHVbLGef4e3Fz2Npgh8NKc.jar
    Sep 13, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-lQaCV3zUXE9ILN-2VqrHsFj47-zegCG5YShcG92Wplg.jar
    Sep 13, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-B9lyWrTPZ4fgHPTj6B_NxhO8VE9mk0s7WSaueS8RLtU.jar
    Sep 13, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-0wy5F-Nwxr5Ekm0u9d1iKYS4vsr-hOo-iVR3Ry8NoZc.jar
    Sep 13, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-ln8LRBdObK2cYmKGYULZfce4Bl07Rokubu5lbj75Qzk.jar
    Sep 13, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5409813205407147310.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-LCx7FKZn8EznAFvuQQbe0NK7ersr0eKz6cXIzmaygpk.jar
    Sep 13, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-tHDQaGgDz6XMMKepWmdMGOtRb28LM9FL66waW0bjHOo.jar
    Sep 13, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-S4AV1RZD3fUf6EzFIHU-DhWjrkoGa-YV26Rk1CZJACA.jar
    Sep 13, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-MBKxaDAaxjOYXbqolb4JvWYkIvOqluiWS_9xbMagIDY.jar
    Sep 13, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-THxGf0hcU9WKJebSyDzSLCLA4FfamW479y-j5ddvEpc.jar
    Sep 13, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-zIV1ItT13P0dLS9ytEImapkMPpr9OUBPOqkpx7a2eP0.jar
    Sep 13, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-afIIvq9JjxkurTfAJsZNhCEG-LQDhnlia41k9mZ_u4A.jar
    Sep 13, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-3nrAf-ljc-rQU7m-pndYCR_iyPaDKIvDWAFyAyg2w64.jar
    Sep 13, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-tXiipUOYUqJFyQjzFiKTbjNFNaW-w3tOF4byaBE9bFw.jar
    Sep 13, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-NskZkMekPglgsDOO0ZpqOHT0vNzRVXmTHvtm6aQG5so.jar
    Sep 13, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-Krm9RkdTfraR6NNvJfx9a217BXszqlH5w4aq3l9wG44.jar
    Sep 13, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-vMbAs6RvtMCnxOfuqzmo9sPY-sohSNRM8tSWv7_zyYY.jar
    Sep 13, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-kb0nlc2IOW72Ejq6pHqzHWHZVq1FiLWS9pwqO9Ww8SY.jar
    Sep 13, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-gIu3PxU2NP0fN4HpqSY8YZHClIsX4ADJRJ8w7zoBgJo.jar
    Sep 13, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-ZUkscO8PYsR6jfFaWUM0CCCeKGOTnyCWnLRty27U0UE.jar
    Sep 13, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-cM5Zu9J6JOYRD-ncyztbRLHunP0LrNMIh-eeH5XSKmA.jar
    Sep 13, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-v1qLjiB9b6E3OHkb3NTYwXYVJpREIL-7ksa9-_701r0.jar
    Sep 13, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-5veijYU9SXyj_Hc6GwoVCsQwUA1KPIW4cI-KqZQTR4w.jar
    Sep 13, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-mATFtbd-r1VcWjhoPESoSgOEVSH6mFMglCuXInDY5DE.jar
    Sep 13, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-LDQfxQSwJ5MvLUUi8f30TJWYkWHvEKWwLJAS7-YZSkE.jar
    Sep 13, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-IUz-gmo-5kS2x0dL1lBeCPt0MqLi0P48hxg_dVd69RM.jar
    Sep 13, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-vWTfM6-DXNI9E4BXdV2A78ERUXjgbz9qYok7zAxf8Jo.jar
    Sep 13, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-s8WyqBV2rTfljzyQcdIU9MQPzwpuHROhzADp-4jOe80.jar
    Sep 13, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-0LKQitoIub_1pjxPE2P_RjzeK_HkE3dfHDPfvSbwDOk.jar
    Sep 13, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-7Il7zTQpQvycpdio4NCShPIUO1TJBqGIsxehvv3wksk.jar
    Sep 13, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 0 seconds
    Sep 13, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 13, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 13, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 13, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 13, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 13, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95394 bytes, hash f549fa8c906b1f44b49dd31565360e2dde8d1ff5090edcb01e55c17a77053cce> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-9Un6jJBrH0S0ndMVZTYOLd6NH_UJDtywHlXBencFPM4.pb
    Sep 13, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 13, 2020 6:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-12_23_45_18-3668877861938299411?project=apache-beam-testing
    Sep 13, 2020 6:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-12_23_45_18-3668877861938299411
    Sep 13, 2020 6:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-12_23_45_18-3668877861938299411
    Sep 13, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-13T06:45:18.238Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 13, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T06:45:24.567Z: Worker configuration: n1-standard-1 in us-central1-f.
    Sep 13, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T06:45:25.399Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 13, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T06:45:25.437Z: Expanding GroupByKey operations into optimizable parts.
    Sep 13, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T06:45:25.466Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 13, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T06:45:25.542Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 13, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T06:45:25.560Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 13, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T06:45:25.596Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 13, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T06:45:25.629Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 13, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T06:45:26.061Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 13, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T06:45:26.130Z: Starting 5 workers in us-central1-f...
    Sep 13, 2020 6:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-13T06:45:50.314Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 13, 2020 6:45:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T06:45:52.865Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 13, 2020 6:46:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T06:46:09.821Z: Workers have started successfully.
    Sep 13, 2020 6:46:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T06:46:09.856Z: Workers have started successfully.
    Sep 13, 2020 6:47:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T06:47:44.208Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 13, 2020 6:47:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T06:47:44.338Z: Cleaning up.
    Sep 13, 2020 6:47:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T06:47:44.434Z: Stopping worker pool...
    Sep 13, 2020 6:48:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T06:48:37.225Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 13, 2020 6:48:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T06:48:37.274Z: Worker pool stopped.
    Sep 13, 2020 6:48:46 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-12_23_45_18-3668877861938299411 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 5ed4b5eb-ecc6-45e8-8325-d9bbd7ecfbea and timestamp: 2020-09-13T06:48:46.413000000Z:
                     Metric:                    Value:
                   read_time                     11.72
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 13, 2020 6:48:46 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
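
For context on the BEAMPlan above: with BeamPushDownIOSourceRel, the projection (usedFields) and the supported filter are handed to the BigQuery Storage API read itself, which is roughly what one would write by hand as below. The table reference is an assumption for illustration only; the IT reads its own copy of the Hacker News data.

    import java.util.Arrays;
    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.values.PCollection;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        // Needs a GCP project (e.g. --project=...) and credentials when actually run.
        Pipeline p = Pipeline.create();
        // Hypothetical table reference, not the table the test actually uses.
        PCollection<TableRow> rows = p.apply(
            BigQueryIO.readTableRows()
                .from("bigquery-public-data:hacker_news.full")
                .withMethod(BigQueryIO.TypedRead.Method.DIRECT_READ)
                // Column projection, as in usedFields=[[by, type, title, score]] above.
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                // Predicate push-down, as in "Pushing down the following filter" above.
                .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));
        p.run().waitUntilFinish();
      }
    }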

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.018 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 3 mins 41.732 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 30s
107 actionable tasks: 71 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/clotlh7cmjyb4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #988

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/988/display/redirect?page=changes>

Changes:

[noreply] [BEAM-10861] Adds URNs and payloads to PubSub transforms to allow


------------------------------------------
[...truncated 283.34 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 13, 2020 12:45:27 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 13, 2020 12:45:27 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 13, 2020 12:45:27 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 13, 2020 12:45:27 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 13, 2020 12:45:27 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 13, 2020 12:45:27 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 13, 2020 12:45:27 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
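
The same missing-coder failure shows up again in this run. Besides setRowSchema, the error's first suggestion (setCoder) also works for Rows when given a RowCoder built from the schema; a small sketch of that alternative:

    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowCoderSketch {
      // Attach an explicit RowCoder so coder inference is never needed for this collection.
      static PCollection<Row> withExplicitRowCoder(PCollection<Row> rows, Schema schema) {
        return rows.setCoder(RowCoder.of(schema));
      }
    }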

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 13, 2020 12:45:27 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 13, 2020 12:45:27 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 13, 2020 12:45:28 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 13, 2020 12:45:28 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 13, 2020 12:45:28 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 13, 2020 12:45:28 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 13, 2020 12:45:28 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 13, 2020 12:45:28 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 13, 2020 12:45:28 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 13, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 13, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-5diZz_uRuWZG_q1Il7CDozVCAppfQDK8T2dLeFthDlI.jar
    Sep 13, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-PsicMQVhUErgQCGuw21uDfF6jSXXtKJaCPYulUA-9QQ.jar
    Sep 13, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests--Jt-k7dumamrQmhLJNzXhiJzM73jPyiBuZmL4AG3TPg.jar
    Sep 13, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-HtMUjouDbvAU9MXDLQcwxU6prA2Vyr9UUfXXksIHWZA.jar
    Sep 13, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-nUzjyJ5YZGFE4qbm2WEJMRKytgV5-d4jTyt9FVA-tis.jar
    Sep 13, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-Dsd0oGbV8-T4RUhKsEVhTaUwrFTFLNUBQWai55VKdX8.jar
    Sep 13, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-357HhFVGhBkuhVI9RSD0aDbaMfrUO42YKmHKrkW_4Yk.jar
    Sep 13, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-LUro0SK7PU48MlBjVGoas2o0EB_-_1SPUok8onnWba8.jar
    Sep 13, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-0hQDuHBb-2xa9q6bP3lou2RjfYaF5kKAs-TJNTyj_V0.jar
    Sep 13, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-5gOIj_tY1RD9W4KHYQ3ACCMfb9Ib4UfSGk6gyehCUt4.jar
    Sep 13, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-jJHXaUhzXCpOHyfbNI4DyNhjBfVHaWI9JDQKcIt2l3s.jar
    Sep 13, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-jhEoARraI2zeDO8fF1aHqOWBmX8BRv7sURWX1Viy-Ug.jar
    Sep 13, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-qOVRK4QDV0H9PkLf7t5h9rRmwyui9vp4wXJCvxsiqcs.jar
    Sep 13, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-5diZz_uRuWZG_q1Il7CDozVCAppfQDK8T2dLeFthDlI.jar
    Sep 13, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-3NxfJ64PRjNXPikzivHA4serZNBAB1z763nDQJh68A4.jar
    Sep 13, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-VBd__NjrNDUUrc4XIcBPVps7AXs9OjCPTZQshIEJEME.jar
    Sep 13, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-Znukb20zjBJqH_r62o8nX24I1dEVGLnVN5xjX_zt_jA.jar
    Sep 13, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5044388266524388549.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-JwngBD1hPWItBiz9vePm9B41dKxLCFCnrhPiryrL0is.jar
    Sep 13, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-5jnrMmvMGb2gqEO3e7N-_TK8GvTM5NbMMV4fF9JOrtA.jar
    Sep 13, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-yvs2boNR8423KvnJ3m_nw3CwRl656AeAjRogEgxdU6s.jar
    Sep 13, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-sGDEymxeHpIzRhzcgnVrf2l_yUK6DaGJx0Rj5C5YhNI.jar
    Sep 13, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-Ul447er-TayFIqHXaByOU3N_hWAOjDAzfwk6lKSGPMM.jar
    Sep 13, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-FTjHAdyKgcKXZrVK27U0zRldZWqzB_xMaQELN8VGTJw.jar
    Sep 13, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-pDDxdYEivLHGhOraPMM1cSTQD443cmIn3jLkmT2lcu8.jar
    Sep 13, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-kYeB-lyqjvdvPz_I_8n6FoUfBkXdcMGV6rSpWTz2N3Q.jar
    Sep 13, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-Vj1fCqGwYfIbGHsYwYdfilFOv-JU43KwYv2BAOOEyJ8.jar
    Sep 13, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-g7OE6gHE8N_YT01ujnuWoM5yS3-CGvVg1d3LQANrMKI.jar
    Sep 13, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-QH23r_soQXfmHsUh0FW4MYrDLQwDIIdZ_ch_B7rQzH4.jar
    Sep 13, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-YxoWbVyYMdqq2kNFDr3wOLo8wK0E_9LFW5OjgJr6cz0.jar
    Sep 13, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-UPRhPAx6KcOdZJQvnAD1DeiAD0V3sXuFDjfaxdBO4Es.jar
    Sep 13, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-hau9eW65Gd8rrwwLwjX0BKVNLVjWylZ4J_1TUR21DjE.jar
    Sep 13, 2020 12:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 1 seconds
    Sep 13, 2020 12:45:32 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 13, 2020 12:45:32 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 13, 2020 12:45:32 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 13, 2020 12:45:32 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 13, 2020 12:45:32 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 13, 2020 12:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95394 bytes, hash c6e8a4483f86b0fe17ff7d8684109547fabd26bf7459236e1bde8f74a5b8d843> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-xuikSD-GsP4X_32GhBCVR_q9Jr90WSNuG96PdKW42EM.pb
    Sep 13, 2020 12:45:32 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 13, 2020 12:45:33 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-12_17_45_32-2442042179356823465?project=apache-beam-testing
    Sep 13, 2020 12:45:33 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-12_17_45_32-2442042179356823465
    Sep 13, 2020 12:45:33 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-12_17_45_32-2442042179356823465
    Sep 13, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-13T00:45:32.689Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 13, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T00:45:40.353Z: Worker configuration: n1-standard-1 in us-central1-f.
    Sep 13, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T00:45:40.923Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 13, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T00:45:40.959Z: Expanding GroupByKey operations into optimizable parts.
    Sep 13, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T00:45:40.993Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 13, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T00:45:41.087Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 13, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T00:45:41.116Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 13, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T00:45:41.152Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 13, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T00:45:41.187Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 13, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T00:45:41.537Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 13, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T00:45:41.611Z: Starting 5 workers in us-central1-f...
    Sep 13, 2020 12:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-13T00:45:56.085Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 13, 2020 12:46:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T00:46:13.651Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 13, 2020 12:46:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T00:46:32.764Z: Workers have started successfully.
    Sep 13, 2020 12:46:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T00:46:32.800Z: Workers have started successfully.
    Sep 13, 2020 12:47:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T00:47:02.128Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 13, 2020 12:47:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T00:47:02.290Z: Cleaning up.
    Sep 13, 2020 12:47:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T00:47:02.394Z: Stopping worker pool...
    Sep 13, 2020 12:47:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T00:47:50.435Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 13, 2020 12:47:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T00:47:50.487Z: Worker pool stopped.
    Sep 13, 2020 12:47:57 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-12_17_45_32-2442042179356823465 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): f35ccb0b-2a11-46b0-b633-bde6ac702d52 and timestamp: 2020-09-13T00:47:57.815000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    11.619

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 13, 2020 12:47:58 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
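
The read_time and fields_read figures above are collected from the pipeline's metrics once the Dataflow job reaches DONE. A generic sketch of querying such values from a PipelineResult; the namespace and metric names here are assumptions, not the IT's actual identifiers:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.PipelineResult;
    import org.apache.beam.sdk.metrics.MetricNameFilter;
    import org.apache.beam.sdk.metrics.MetricQueryResults;
    import org.apache.beam.sdk.metrics.MetricResult;
    import org.apache.beam.sdk.metrics.MetricsFilter;

    public class MetricsQuerySketch {
      // Sums the attempted value of counters matching the given (hypothetical) namespace/name.
      static long counterValue(PipelineResult result, String namespace, String name) {
        MetricQueryResults metrics = result.metrics().queryMetrics(
            MetricsFilter.builder()
                .addNameFilter(MetricNameFilter.named(namespace, name))
                .build());
        long total = 0;
        for (MetricResult<Long> counter : metrics.getCounters()) {
          total += counter.getAttempted();
        }
        return total;
      }

      public static void main(String[] args) {
        Pipeline p = Pipeline.create();
        // ... build a pipeline that increments Metrics.counter(namespace, name) somewhere ...
        PipelineResult result = p.run();
        result.waitUntilFinish();
        System.out.println("fields_read = " + counterValue(result, "example", "fields_read"));
      }
    }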

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.022 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.032 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 5,5,main]) completed. Took 2 mins 38.428 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 42s
107 actionable tasks: 74 executed, 33 from cache

Publishing build scan...
https://gradle.com/s/nuhhjcoofl2yo

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #987

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/987/display/redirect>

Changes:


------------------------------------------
[...truncated 279.24 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 12, 2020 6:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 12, 2020 6:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 12, 2020 6:45:11 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 12, 2020 6:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 12, 2020 6:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 12, 2020 6:45:11 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 12, 2020 6:45:11 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
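
This failure (like the readUsingDirectReadMethod one truncated above) happens before any job is submitted: the Row PCollection produced under BeamIOSourceRel_95 has no schema attached, so no RowCoder can be inferred. The exception text already names the two remedies. A minimal, hedged sketch of them -- not the test's actual fix; the schema below is illustrative, matching the four projected columns rather than the table's real schema:

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    class RowSchemaSketch {
      /** Attaches an illustrative schema so Beam can infer a coder for the rows. */
      static PCollection<Row> withSchema(PCollection<Row> rows) {
        Schema schema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();
        // Option 1: attach the schema; a RowCoder is then inferred automatically.
        return rows.setRowSchema(schema);
        // Option 2 (equivalent for Row): rows.setCoder(
        //     org.apache.beam.sdk.coders.RowCoder.of(schema));
      }
    }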

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 12, 2020 6:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 12, 2020 6:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 12, 2020 6:45:11 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 12, 2020 6:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 12, 2020 6:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 12, 2020 6:45:11 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 12, 2020 6:45:12 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 12, 2020 6:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
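
    The BEAMPlan and the line above show both halves of the push-down: the projection (usedFields = by, type, title, score) and the filter handed to the BigQuery Storage Read API. As a rough, hand-written approximation only -- not the code the planner generates; the table reference is an assumption, while the field list and row restriction are copied from the log above -- the equivalent read expressed directly against BigQueryIO would look something like this:

    import com.google.api.services.bigquery.model.TableRow;
    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.values.PCollection;

    class DirectReadPushDownSketch {
      /** Sketch of a projected, filtered Storage API read; the table name is assumed. */
      static PCollection<TableRow> read(Pipeline p) {
        return p.apply(
            "Read HACKER_NEWS with push-down",
            BigQueryIO.readTableRows()
                .from("bigquery-public-data:hacker_news.full") // assumption, not from the log
                .withMethod(BigQueryIO.TypedRead.Method.DIRECT_READ)
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                .withRowRestriction("(`type` = 'story' OR `type` = 'job') AND `score` > 2"));
      }
    }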
    Sep 12, 2020 6:45:13 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 12, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 12, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-RIUc7D-wT-3Xu-pSSnNbaoVWmG8qFsrrUtIlGXCSc-M.jar
    Sep 12, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-JOGwniywT7hH4scTaazL5e5CGe6HLQjWve5gadGBatc.jar
    Sep 12, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-3TDCx_Pcriv3b8nmWAAy5zF7p2PNpVhEQ78_wjtIT7o.jar
    Sep 12, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-ecM-cjZHsuv-vLAdNMJsT05d3vjpOVRghbpp0xHfhoc.jar
    Sep 12, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test6190156969055888399.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-iPGL1gHZB5hn36EPyFWerbiLaDfXxWsXpgUr3bqUOuI.jar
    Sep 12, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-3TDCx_Pcriv3b8nmWAAy5zF7p2PNpVhEQ78_wjtIT7o.jar
    Sep 12, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-411spltmxUvPD3u9EX-POIl3rPvOt4_uR7dm-DfjzvQ.jar
    Sep 12, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-UpDOCdEtfAQuVo77juDMMclhYOBtHd6f6a1X2O6WTPk.jar
    Sep 12, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-FT6Guez8yRRjdcifKSGNjhebxe6M7o4hIeTBgI-xZIo.jar
    Sep 12, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-mV0N9SV_LwY-3pfTMjTzcwHcuuO3eSNpB3KHNGe84l0.jar
    Sep 12, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-n6XOhc2iAq9zOePawdseFU42CWbypMPT_ZElyDV4vHw.jar
    Sep 12, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-UNHdoIG4fNDDhzjkm6qJ_eIiwQrZ4e9dLpAsjK6t9MQ.jar
    Sep 12, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-xMCTfAN30JbgXY_daEWjMtUOrMNoerZ754VjzW-cgac.jar
    Sep 12, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-tQ1q-R8thiOp6Dn5IYHvFMqAeUDdvkdgX_6ApkN9B_I.jar
    Sep 12, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-DQmS06U8Fvxm7Gn87IeTb41ANociGGIOkFcgKB3KSAM.jar
    Sep 12, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-ftMHWMKbwWwxPnHkF6wbAfJXsyIVZsmuOTaU2iNwTJE.jar
    Sep 12, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-MR0nrPmMOTriTglSOWSASq6pZT5cw-hdETF5OTSoBec.jar
    Sep 12, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-4bY7c9djIvf77Ohvq5xwnNHNSaInJFk4Lrcx7IPqgrI.jar
    Sep 12, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-9hYTxpQnOSZxMvlcUNmuPbrMchDYYptfb_EkQsR5k3k.jar
    Sep 12, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-osZtRfKznYFw_JlDkTISSmC4hifh3JChHG2-T13pZes.jar
    Sep 12, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-WhmxBKtp8f-C3cGV5e12Rsln9Btl7v_eaQywoKYvKlk.jar
    Sep 12, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-n_AXw0VKyif5jyBHKfPrfUbMLLEBUHZDl1WkZQBCf9E.jar
    Sep 12, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-OOUvlm1vQloTVR3tnCJ1l0BHpEuFBgoPlQaoAmezOCc.jar
    Sep 12, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-UktcsxXvOfsKiBeyn0cHIzw4LZrrR0536Ro9fLlvYyU.jar
    Sep 12, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-impoD8M75cj7CpGL_PyURf7JL09AWCNSchxAigXnewA.jar
    Sep 12, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-bclA7dAD9g3Jj7O7BtKLIrJfqbzFf7nbnrDrgC2Fj2M.jar
    Sep 12, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-nsBGxwGbmUzGEAKrRArLP3lftpkZ_afNbesLJApdmVY.jar
    Sep 12, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-3RO2ob7iNEzjbqzR7r2Xt-B_GqKniiJppug_H3fEuk0.jar
    Sep 12, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-YhPkReUb_NrwpSFYgWvYafPWyxQ2GKkuEFDYvJcz-_c.jar
    Sep 12, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-q9XYvHS5ZGrt0fWTG77xv9npC79r9Pjt633HCAVnOEc.jar
    Sep 12, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-EzPogaV70wP_A03P-JRUVrtHfL5MYfGaMLFMry0aO7E.jar
    Sep 12, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 0 seconds
    Sep 12, 2020 6:45:17 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 12, 2020 6:45:17 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 12, 2020 6:45:17 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 12, 2020 6:45:17 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 12, 2020 6:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 12, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95394 bytes, hash b4da06e9077831184b032eee574a659454966fefdd15dff5e55e10331691ed40> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-tNoG6Qd4MRhLAy7uV0pllFSWb-_dFd_15V4QMxaR7UA.pb
    Sep 12, 2020 6:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 12, 2020 6:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-12_11_45_17-18202781121867523875?project=apache-beam-testing
    Sep 12, 2020 6:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-12_11_45_17-18202781121867523875
    Sep 12, 2020 6:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-12_11_45_17-18202781121867523875
    Sep 12, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-12T18:45:17.576Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
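
    This warning only reflects the worker-pool options supplied to the job (presumably as command-line pipeline options): with autoscalingAlgorithm=NONE, the service runs the fixed numWorkers count and ignores maxNumWorkers. A hedged sketch of the same combination set programmatically, using option names from the warning:

    import org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions;
    import org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions.AutoscalingAlgorithmType;
    import org.apache.beam.sdk.options.PipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    class FixedWorkerPoolSketch {
      /** Fixed-size pool: set numWorkers and disable autoscaling; maxNumWorkers is then ignored. */
      static PipelineOptions fixedPoolOf(int workers) {
        DataflowPipelineWorkerPoolOptions options =
            PipelineOptionsFactory.create().as(DataflowPipelineWorkerPoolOptions.class);
        options.setNumWorkers(workers);
        options.setAutoscalingAlgorithm(AutoscalingAlgorithmType.NONE);
        // Setting maxNumWorkers here would trigger exactly the WARNING logged above.
        return options;
      }
    }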
    Sep 12, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T18:45:24.723Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 12, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T18:45:25.377Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 12, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T18:45:25.418Z: Expanding GroupByKey operations into optimizable parts.
    Sep 12, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T18:45:25.446Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 12, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T18:45:25.498Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 12, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T18:45:25.527Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 12, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T18:45:25.560Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 12, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T18:45:25.596Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 12, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T18:45:25.983Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 12, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T18:45:26.064Z: Starting 5 workers in us-central1-b...
    Sep 12, 2020 6:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-12T18:45:51.325Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 12, 2020 6:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T18:45:56.439Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 12, 2020 6:46:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T18:46:14.063Z: Workers have started successfully.
    Sep 12, 2020 6:46:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T18:46:14.097Z: Workers have started successfully.
    Sep 12, 2020 6:46:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T18:46:49.517Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 12, 2020 6:46:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T18:46:49.668Z: Cleaning up.
    Sep 12, 2020 6:46:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T18:46:49.751Z: Stopping worker pool...
    Sep 12, 2020 6:47:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T18:47:41.359Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 12, 2020 6:47:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T18:47:41.400Z: Worker pool stopped.
    Sep 12, 2020 6:47:49 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-12_11_45_17-18202781121867523875 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 1d7a904e-2477-4cbd-b44d-81bd21923c91 and timestamp: 2020-09-12T18:47:49.267000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    16.634

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 12, 2020 6:47:49 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.018 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.025 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 2,5,main]) completed. Took 2 mins 46.715 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 33s
107 actionable tasks: 71 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/zkm24aezxos36

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #986

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/986/display/redirect>

Changes:


------------------------------------------
[...truncated 278.99 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 12, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 12, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 12, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 12, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 12, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 12, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 12, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 12, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 12, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 12, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 12, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 12, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 12, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 12, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 12, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 12, 2020 12:45:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 12, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 12, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-z90nMHxJysgy2fVqLL7dSxMRTpaj_DkOkaKyTm5AjiQ.jar
    Sep 12, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-xjpsY1CDdiNH4GcH0IosGDFe5-fc0BpbWzAYkDaBXDk.jar
    Sep 12, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-nODW2BB8MfYe49sXEMO_JDDOHJkv5ZvqD-5t-C5COo0.jar
    Sep 12, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-CJNWcV72-nHwEc6so1QNIPwuBMgdbbrvePhYIkqxEJE.jar
    Sep 12, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-VzbQGxA-iPL-QW8hf3xGYCCcBNOLEn6n4Plo2-I8c34.jar
    Sep 12, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-bhFZYA0yTghTs8BwfiPAw_igEJLTzWIggb-wb5eO4dY.jar
    Sep 12, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-LNxTYSuNZ7QbKSkngFHAGYM5-tQaEOT0uOcUs5IypRU.jar
    Sep 12, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-3_0ziMRMnaat9w_fu2QFfDZCQWjcqMh68i9al9qPLDk.jar
    Sep 12, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-Py3cehl7iScRQtQHX6YrscKeUxy12A7PoEbe8zKdbCk.jar
    Sep 12, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-z90nMHxJysgy2fVqLL7dSxMRTpaj_DkOkaKyTm5AjiQ.jar
    Sep 12, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-4wMWeARapQtYGfuZAmT6avEPNlYRFwfopp_gV2-jPnU.jar
    Sep 12, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-g8a4J2UgngcLcnCBbJqF2L5A9ZvPf96pvOHqb94wikk.jar
    Sep 12, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-V4wdv-LgUm1EjG9OBFD-pNMIFORqO13fF1W7EWNtfmI.jar
    Sep 12, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-MUoq4Th92I1RO1GjuRCGXo0oQySg97GPRrp2cXIOj1o.jar
    Sep 12, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-RpPlxJ7fHCY1KSHW2z66JAeNDzK-VNHaBazOUXU7LP0.jar
    Sep 12, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT--norSEYy8lHMxHHBJd9Lg9fOih6hLAGDT-B_tY-TmUs.jar
    Sep 12, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-VzvJl3bkjrKRxpDIF4GDVI2OM8Nhyw_-7hRP8yQp_Us.jar
    Sep 12, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-tmdEIuqQHxsfrSq96AKHs3FopvuTnJmRN1Lb07eSK7U.jar
    Sep 12, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-Y_-uV9q90QrmP6dSA3S9h5xtI50CDXKyguWJKFlgmcI.jar
    Sep 12, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-_YIbCxfA5XOyMVo8kquLO4qZ12fnMRBe370ygweFVFI.jar
    Sep 12, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-xzcRqjAjnRJqLvLaM7fFhb3v5jRY_ltSAzqPHWIJq_s.jar
    Sep 12, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test6928200970980496070.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-VVAkN4VVwgNkWFZPVgeiVTlFsBkPZkxJlw8J3hLAx6M.jar
    Sep 12, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-7pr-N1rK0dP8R1615EguLH4hV6Brba2gsrh1Tf_YX6Q.jar
    Sep 12, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests--snj8XHUcb4kiXHNuVfyh6Z7QhC-d6mM5HW6P-oUHUs.jar
    Sep 12, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-lG1-tl5G4gHXARL5ZlvTEwCV0ZZzGdHU6Ykq564C3vo.jar
    Sep 12, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-Yo823blVKIewudqVShpCEA4hC9XW7xc9v4Frj5ESu9s.jar
    Sep 12, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-VYBl92V6ffH6vF2JaZAiXMvmaEraZbVDF1jgMA2q6DY.jar
    Sep 12, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-QoY-pks85DnX2JST7eBLKflvwzn-bmdKxDHkzn61JtI.jar
    Sep 12, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-LvmNivz44mdZAG9a5pJhw9B2xOwU66F3BPOjHJTGwr4.jar
    Sep 12, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-cZig0-PTcyW469z9Onvi7DPDcVsILcGnLup1u5RHN_U.jar
    Sep 12, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-JhVoQ5LeV5Ln_TeNpduvnX56buwmbUsfboaChT29qss.jar
    Sep 12, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 1 seconds
    Sep 12, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 12, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 12, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 12, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 12, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 12, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95394 bytes, hash 56e9b29f7c8cea4e2786bbc07fdbf248902a4482ff6987356ff4b9025686fa51> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Vumyn3yM6k4nhrvAf9vySJAqRIL_aYc1b_S5AlaG-lE.pb
    Sep 12, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 12, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-12_05_45_19-16738107090139455381?project=apache-beam-testing
    Sep 12, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-12_05_45_19-16738107090139455381
    Sep 12, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-12_05_45_19-16738107090139455381
    Sep 12, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-12T12:45:20.066Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 12, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T12:45:28.861Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 12, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T12:45:29.622Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 12, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T12:45:29.683Z: Expanding GroupByKey operations into optimizable parts.
    Sep 12, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T12:45:29.721Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 12, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T12:45:29.802Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 12, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T12:45:29.828Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 12, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T12:45:29.854Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 12, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T12:45:29.886Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 12, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T12:45:30.254Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 12, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T12:45:30.327Z: Starting 5 workers in us-central1-b...
    Sep 12, 2020 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-12T12:45:54.737Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 12, 2020 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T12:45:59.391Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 12, 2020 12:46:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T12:46:15.328Z: Workers have started successfully.
    Sep 12, 2020 12:46:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T12:46:15.360Z: Workers have started successfully.
    Sep 12, 2020 12:46:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T12:46:49.190Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 12, 2020 12:46:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T12:46:49.328Z: Cleaning up.
    Sep 12, 2020 12:46:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T12:46:49.421Z: Stopping worker pool...
    Sep 12, 2020 12:47:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T12:47:40.379Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 12, 2020 12:47:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T12:47:40.429Z: Worker pool stopped.
    Sep 12, 2020 12:47:48 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-12_05_45_19-16738107090139455381 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): c30848eb-5256-44c0-bfe0-063b522d740c and timestamp: 2020-09-12T12:47:48.099000000Z:
                     Metric:                    Value:
                   read_time                    15.368
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 12, 2020 12:47:48 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 2 mins 42.057 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 31s
107 actionable tasks: 71 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/yi5bd2nvpocyw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #985

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/985/display/redirect>

Changes:


------------------------------------------
[...truncated 278.98 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 12, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 12, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 12, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 12, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 12, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 12, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 12, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
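
The failure above is the one the exception message itself explains: a ParDo that emits Beam Row elements has no default coder unless the output PCollection is given a schema. Below is a minimal, self-contained sketch of the fix the message points at (PCollection.setRowSchema); the schema, field names, and values are illustrative assumptions, not the test's actual code.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaExample {
      // Hypothetical schema mirroring the columns projected by the query above.
      static final Schema SCHEMA =
          Schema.builder()
              .addStringField("author")
              .addStringField("type")
              .addStringField("title")
              .addInt32Field("score")
              .build();

      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        PCollection<Row> rows =
            p.apply(Create.of("story", "job"))
                .apply(
                    "ToRow",
                    ParDo.of(
                        new DoFn<String, Row>() {
                          @ProcessElement
                          public void process(@Element String type, OutputReceiver<Row> out) {
                            out.output(
                                Row.withSchema(SCHEMA)
                                    .addValues("someone", type, "a title", 3)
                                    .build());
                          }
                        }))
                // Without this call, coder inference fails exactly as in the stack
                // trace above ("Cannot provide a coder for a Beam Row").
                .setRowSchema(SCHEMA);

        p.run().waitUntilFinish();
      }
    }

In BigQueryIOPushDownIT the row schema would normally come from the resolved table schema; the sketch only illustrates the API the error message recommends.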

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 12, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 12, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 12, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 12, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 12, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 12, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 12, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 12, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
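
For readers unfamiliar with what the push-down logged above means in IO terms: the planner's usedFields list and the "supported" filter correspond to a BigQuery Storage API read with selected fields and a row restriction. The sketch below is a hand-written equivalent of such a read, assuming a placeholder table reference; it is not the test's code, and running it would require real credentials and a real table.

    import java.util.Arrays;
    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.values.PCollection;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        // Read only the fields the query uses and push the supported filter down
        // to the Storage API, mirroring what BeamPushDownIOSourceRel does above.
        // "project:dataset.hacker_news" is a placeholder table reference.
        PCollection<TableRow> rows =
            p.apply(
                BigQueryIO.readTableRows()
                    .from("project:dataset.hacker_news")
                    .withMethod(BigQueryIO.TypedRead.Method.DIRECT_READ)
                    .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                    .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }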
    Sep 12, 2020 6:45:12 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 12, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 12, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-pn0SwLdkI0pahJHYDBxPOgX1WZ4uDVZYjccQYpo7oeA.jar
    Sep 12, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-pn0SwLdkI0pahJHYDBxPOgX1WZ4uDVZYjccQYpo7oeA.jar
    Sep 12, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-nHH4ftcX39NARRKedGQ-QpjNjYd4a11pfxciViCe6WA.jar
    Sep 12, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-jO6kSsJON2GXcYBaUa0Tj1Gsbic9VbXIGiCQp5OTgzo.jar
    Sep 12, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test6718575402158905871.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-BimeewIml3XjtJmxQN6uSY1w7XOznK23PThpumby1_k.jar
    Sep 12, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-HRvyD_NXuSL1ch2rh1djptd_MzKnU-qC4UPqGIO9okw.jar
    Sep 12, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-vJH7mzl1AJV7bcuyyNsUnPzyz5Sr1kutpVZKRUNJwZk.jar
    Sep 12, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-LVCDZecLiZUzatl6SeUZtjbifFr4sbkibKSH2v4wRp8.jar
    Sep 12, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-ienPwK4WHmGnmaI2W5zWYnMwUKc-GzXpnZ9mOPXoCrY.jar
    Sep 12, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-pftuOscHaO-bYIyABe4Cgm8kf6N_yF21i3E8OyI-pjA.jar
    Sep 12, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-pll8XQujkQm9xci3NAiDCanQthD5_p6wb3UTP7tZnhw.jar
    Sep 12, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-ThKXZH1IShz3FlUXCOcU0lHq7yEtECCxlmEKx0JbDbc.jar
    Sep 12, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-znClMJiYPgsaa1MPlqbpSoFgjga7d4s_UlyNS2vPGJk.jar
    Sep 12, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-Ef5uCdly_r_tyj8K959BRBZGC2j7qxY5jReJGF7SDYA.jar
    Sep 12, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-gbeMcPIVKrLUtGT8t5XVeYyf26ImW84NjGLoXbtWaJQ.jar
    Sep 12, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-v_QUL2o2QxfjsYRTwhs4_Dv9P_Qbb9zKRsQ5PFgNgNY.jar
    Sep 12, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-DJSeyeXtpfB3J-QuORA_xtu8adBGltYze8ETKXN0Dkc.jar
    Sep 12, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-yM-uNbxr8e_3PV6fkCGDGzPlzdKI6RSzau9j2E-TbRI.jar
    Sep 12, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-Z5vgNRtS0P5VaUMfWbOZ3kwQYsKdW-y5FvhgYNBI9gw.jar
    Sep 12, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-ZaNL3XsficK2_lUHe52wHDbaztA5Fc6F8op0xv1J_ms.jar
    Sep 12, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-JEYjAyhR2tyFC7yzbcB_H7SviuT58WmTlTxoApBpuDE.jar
    Sep 12, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-4bbGL-aDTVtvbsa8AJLKyutXZVyddVbvOPACXqXUbs4.jar
    Sep 12, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-AG83rGpAF7JL38S0edVG1YqS08zkx5gB1Tp5Rg0cyEs.jar
    Sep 12, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-EIFlkV3SAZ5kO38OMLdXWaicV_bmQtCnHmm9B5ZYb6g.jar
    Sep 12, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests--MvoIWxENv9JEybVcy6KhN2VtHJYeK68Zcs9qm4-MxE.jar
    Sep 12, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-2TrPgaLA11Wd0pRyaGoipPZijJfucL_8fEwEQBaHfEI.jar
    Sep 12, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-TCQyT7Q7fhsaY5KSEh0wXpTX33tu1xTk4ku1dYN3Pxs.jar
    Sep 12, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-5Dr7fPFgXEq03Gv_n-BVCW-xsnmiyk3ymOhsU136Mpg.jar
    Sep 12, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-g4V0UXIw0Gvxub7uRE3j2bGghZ2qgFLlVBgBVfJwVsg.jar
    Sep 12, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-uVDh8pvQQs56hO8yf5xz7Z5nVAo84SqJauDB5ANudV8.jar
    Sep 12, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-1mc-rbh_zA73BkM3NqhCDU6WmbZ1ssaTVhtCxao4rJ0.jar
    Sep 12, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 1 seconds
    Sep 12, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 12, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 12, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 12, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 12, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 12, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95394 bytes, hash 04195b9a11bc6b4afb18554514516387d16d2121cc710331485ca37a948f8f20> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-BBlbmhG8a0r7GFVFFFFjh9FtISHMcQMxSFyjepSPjyA.pb
    Sep 12, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 12, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-11_23_45_16-13117806937219269478?project=apache-beam-testing
    Sep 12, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-11_23_45_16-13117806937219269478
    Sep 12, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-11_23_45_16-13117806937219269478
    Sep 12, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-12T06:45:16.818Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 12, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T06:45:24.680Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 12, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T06:45:25.404Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 12, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T06:45:25.442Z: Expanding GroupByKey operations into optimizable parts.
    Sep 12, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T06:45:25.480Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 12, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T06:45:25.561Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 12, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T06:45:25.591Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 12, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T06:45:25.621Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 12, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T06:45:25.658Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 12, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T06:45:26.172Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 12, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T06:45:26.233Z: Starting 5 workers in us-central1-b...
    Sep 12, 2020 6:45:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T06:45:53.746Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 12, 2020 6:46:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-12T06:45:59.992Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 12, 2020 6:46:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T06:46:11.206Z: Workers have started successfully.
    Sep 12, 2020 6:46:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T06:46:11.277Z: Workers have started successfully.
    Sep 12, 2020 6:46:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T06:46:40.658Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 12, 2020 6:46:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T06:46:40.809Z: Cleaning up.
    Sep 12, 2020 6:46:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T06:46:40.892Z: Stopping worker pool...
    Sep 12, 2020 6:47:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T06:47:31.621Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 12, 2020 6:47:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T06:47:31.668Z: Worker pool stopped.
    Sep 12, 2020 6:47:40 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-11_23_45_16-13117806937219269478 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 27727e58-9757-4287-bf15-1f91fe943b0d and timestamp: 2020-09-12T06:47:40.597000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    12.907

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 12, 2020 6:47:41 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.018 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.025 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 2 mins 37.963 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 24s
107 actionable tasks: 71 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/x2m3zlfcocgeq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #984

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/984/display/redirect?page=changes>

Changes:

[Luke Cwik] [BEAM-10670] Add support for bundle finalization to DoFnOperator for

[Luke Cwik] [BEAM-10670] Make Flink to be opt-out for SplittableDoFn powering the

[Luke Cwik] fixup! Fix translation issue where we should never be translating

[Luke Cwik] fixup! Address PR comments

[Robert Bradshaw] [BEAM-10833] Fix type inference for BUILD_MAP.

[Robert Bradshaw] Guard test for Python 3.

[Robert Bradshaw] [BEAM-9547] A couple trivial but common dataframe methods.

[Robert Bradshaw] [BEAM-9561] Improve WontImplement reporting.

[noreply] [BEAM-10876] Fix TypeError in dataflow_metrics when distribution sum …

[noreply] [BEAM-7523] Enable KafkaCSVTableIT using KafkaContainer (#12826)

[noreply] Deprecate obsolete CombineFn.add_inputs. (#12802)

[noreply] [BEAM-10678] Split up assertion for clarity (#12828)


------------------------------------------
[...truncated 291.49 KB...]
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 12, 2020 12:45:34 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 12, 2020 12:45:34 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 12, 2020 12:45:34 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 12, 2020 12:45:34 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 12, 2020 12:45:34 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 12, 2020 12:45:34 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 12, 2020 12:45:34 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 12, 2020 12:45:34 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 12, 2020 12:45:34 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 12, 2020 12:45:35 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 12, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 12, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-LJ_9kaheq7w9BOODE3vO_VHU4msmNF498-oDmZDMkqA.jar
    Sep 12, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test6336642198870915392.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-0qkYVZuiOlYMEtwEO0WgTBHUQd_o9bExGhX5sX8cLWM.jar
    Sep 12, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-ZkOt_IZ69LrXxGdhYGjGLrdcSIfCGBMwsIwuv07kBbU.jar
    Sep 12, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-KARW6fBHPnRXIlLEd1wYWPhWs8dOGcTVNyBpb7Olmpg.jar
    Sep 12, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-gFtayNgLjmvfoDs5w7U7gWnXFnjPbZIHw40GPxy1pmo.jar
    Sep 12, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-Yah8QZgL716wAz5TDj42xDmIxYFqCBbI5ktW3aXI9jQ.jar
    Sep 12, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-PwVMF9y-83UPxbi3SGymGSCsbwgGE9wFP1y__NhJ6IY.jar
    Sep 12, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-AppUfyD0dm2Crnbouj_ANpSfXmKQY3cmvkYyhZ_yDJ8.jar
    Sep 12, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-w3bl0P80QNKqfrsTJrr9YQxALjvB16ioQ7efG3dRbMQ.jar
    Sep 12, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-LJ_9kaheq7w9BOODE3vO_VHU4msmNF498-oDmZDMkqA.jar
    Sep 12, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-8ybF8Z2Kv0piFCjVRYEi0O-nx5tQRn5WlRb-r_FtGkk.jar
    Sep 12, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-LDP9Wiqq-yvlrW980XUcNhM5-Ucyk_IATH_OalqaIRU.jar
    Sep 12, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-bXbp07XzKUhn9HH81b9PQX7yr9vOVBMbHlR2kmIklns.jar
    Sep 12, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-3cYzjeOQ_FTy9lWrVVhsUAfry40ywUvdLlu-I25I768.jar
    Sep 12, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-_FwKQoGi2je8cUzB32PXtati5wiu7yFcTOk1gmbLrjY.jar
    Sep 12, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-1gXPKzTmtLLh3xWnFF8K43hCudpDH8uEyXgcbP9JnNg.jar
    Sep 12, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-VYxCfL55YVnvvW-Y6eUUU6xN7fKnI22QJfSWDdNyOeE.jar
    Sep 12, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-caeOSfNzkencZTebL_rnTCftyff8UIgbZqL8AwZpZzo.jar
    Sep 12, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-bYQ7mAF07g8pbHJF5VP_e7DzCUgls5_IlXn4raHwgrM.jar
    Sep 12, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-OJjBUn3KZdbrZr1cVWZlWS73xTiP1qYEf1k1Z_n6Sew.jar
    Sep 12, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-is2k5DpHDpuxSoFkASi5wiuKGCqZaYrDdYtBzm8Oq1E.jar
    Sep 12, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-mRNy2Ezainit0Ls36xdV1-UL55iTgaMp2SRXBPMGpLg.jar
    Sep 12, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-5iIRUowILCRQUCjs2IEmm7SAE8-tega96LAXdl_gcEk.jar
    Sep 12, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-WHwUIp9z6M5IIVNkZgY1PX-lJ5qugqD-M-eZlR3_mqI.jar
    Sep 12, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-c0wWcod0vRgzv4gonS8cX2J3KLbidJeKWHD6ctd6mRk.jar
    Sep 12, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-0eQpnwUWdKoG1qYl1EOKTFpBw0y3hDbnNe9b0AQROi8.jar
    Sep 12, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-uiXwkx4V_Q2jcu7W4v7IjNKN27dlUY-Bkb82SXlgnUI.jar
    Sep 12, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-usT10i7v1Au91jCtFD62bcTFrEbEyb6L91RpWqecJuo.jar
    Sep 12, 2020 12:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-Cl5kVpVBXNyWw__iSK5-awFBp15GhzWUhFNNLKGg_PQ.jar
    Sep 12, 2020 12:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.testcontainers/testcontainers/1.14.3/71fc82ba663f469447a19434e7db90f3a872753/testcontainers-1.14.3.jar to gs://temp-storage-for-perf-tests/loadtests/staging/testcontainers-1.14.3-pumfUOVGxIRJ3HrNRtmRWTZcp5hY59_klYGQF6d6puQ.jar
    Sep 12, 2020 12:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-ejCs8snBUVAmIW8mhAibc780K0D-vyOTVHe_5fv2Q48.jar
    Sep 12, 2020 12:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.testcontainers/kafka/1.14.3/b90885e30e86eb454e7b0e8e580cf59616e9de39/kafka-1.14.3.jar to gs://temp-storage-for-perf-tests/loadtests/staging/kafka-1.14.3-ITKoa6D-lUtLem0itUO_R5B5PqtFpvvOjJf9bTByVUk.jar
    Sep 12, 2020 12:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.rnorth/tcp-unix-socket-proxy/1.0.2/cf53989130986c60113032e25185f4496ffbc186/tcp-unix-socket-proxy-1.0.2.jar to gs://temp-storage-for-perf-tests/loadtests/staging/tcp-unix-socket-proxy-1.0.2-KCMDGSy00Z-cM3qmB0jDR5sQSJV17Q8Xofxkoxij7SM.jar
    Sep 12, 2020 12:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-uxLECycFhDCOyFLuHXIWbZHuOMSnqDwqRC5PiRbKw7o.jar
    Sep 12, 2020 12:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.rnorth.visible-assertions/visible-assertions/2.1.2/20d31a578030ec8e941888537267d3123c2ad1c1/visible-assertions-2.1.2.jar to gs://temp-storage-for-perf-tests/loadtests/staging/visible-assertions-2.1.2-RQSulosjfNzcto_1sHqmOr5JkvkHp3w9YSCqm5BBQBw.jar
    Sep 12, 2020 12:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.rnorth.duct-tape/duct-tape/1.0.8/92edc22a9ab2f3e17c9bf700aaee377d50e8b530/duct-tape-1.0.8.jar to gs://temp-storage-for-perf-tests/loadtests/staging/duct-tape-1.0.8-Mc7xLd7JedH4bXz3CMQaF9pSPQXGhf1mQunQsq3bckA.jar
    Sep 12, 2020 12:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/com.kohlschutter.junixsocket/junixsocket-common/2.0.4/b4d1870bf903412533e0b79c6fcd402defcfc05b/junixsocket-common-2.0.4.jar to gs://temp-storage-for-perf-tests/loadtests/staging/junixsocket-common-2.0.4-r8N2Fez3-t_3TSmvtEP-T2M9OWZG2J2CXoIkoneDn2A.jar
    Sep 12, 2020 12:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/com.kohlschutter.junixsocket/junixsocket-native-common/2.0.4/726bd66a934dea39c817382986496fa4eda96411/junixsocket-native-common-2.0.4.jar to gs://temp-storage-for-perf-tests/loadtests/staging/junixsocket-native-common-2.0.4-92O4XsFT2VMJB0dOfyBspSsocDfXBLrO3aON1cTQ9gw.jar
    Sep 12, 2020 12:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/net.java.dev.jna/jna-platform/5.5.0/af38e7c4d0fc73c23ecd785443705bfdee5b90bf/jna-platform-5.5.0.jar to gs://temp-storage-for-perf-tests/loadtests/staging/jna-platform-5.5.0-JNgWIfgqwp_N2adBFgMfWQeiNDFY5hb0Vzu_okNK4NU.jar
    Sep 12, 2020 12:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/net.java.dev.jna/jna/5.5.0/e0845217c4907822403912ad6828d8e0b256208/jna-5.5.0.jar to gs://temp-storage-for-perf-tests/loadtests/staging/jna-5.5.0-swj66_5O1AnehBDgpjLRZLISawNfbqz_lo05CMr7TZ4.jar
    Sep 12, 2020 12:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.scijava/native-lib-loader/2.0.2/1451fa03954c5e31a358b411147de472b4dab92c/native-lib-loader-2.0.2.jar to gs://temp-storage-for-perf-tests/loadtests/staging/native-lib-loader-2.0.2-5WfHHp8_9T94vVj9a6bUcc4x4SY_XofR4fzF0-2h4kg.jar
    Sep 12, 2020 12:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.jetbrains/annotations/19.0.0/efbff6752f67a7c9de3e4251c086a88e23591dfd/annotations-19.0.0.jar to gs://temp-storage-for-perf-tests/loadtests/staging/annotations-19.0.0-Ev8B7q8MCcamjy7AJLO_n6TK1uaLdLlov2LH91kEcDI.jar
    Sep 12, 2020 12:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 178 files cached, 41 files newly uploaded in 1 seconds
    Sep 12, 2020 12:45:39 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 12, 2020 12:45:39 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 12, 2020 12:45:39 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 12, 2020 12:45:39 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 12, 2020 12:45:39 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 12, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95394 bytes, hash f35bf4a5a513c0bccb385ff729126089f5ac94a4f5d2ed9e647b74ed6a729e1d> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-81v0paUTwLzLOF_3KRJgifWslKT10u2eZHt07Wpynh0.pb
    Sep 12, 2020 12:45:39 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 12, 2020 12:45:43 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-11_17_45_39-8814380002563364598?project=apache-beam-testing
    Sep 12, 2020 12:45:43 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-11_17_45_39-8814380002563364598
    Sep 12, 2020 12:45:43 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-11_17_45_39-8814380002563364598
    Sep 12, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-12T00:45:39.602Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 12, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T00:45:50.578Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 12, 2020 12:45:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T00:45:52.262Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 12, 2020 12:45:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T00:45:52.295Z: Expanding GroupByKey operations into optimizable parts.
    Sep 12, 2020 12:45:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T00:45:52.323Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 12, 2020 12:45:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T00:45:52.398Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 12, 2020 12:45:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T00:45:52.425Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 12, 2020 12:45:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T00:45:52.461Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 12, 2020 12:45:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T00:45:52.495Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 12, 2020 12:45:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T00:45:52.971Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 12, 2020 12:45:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T00:45:53.036Z: Starting 5 workers in us-central1-b...
    Sep 12, 2020 12:45:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-12T00:45:58.703Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
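
The custom metrics this warning counts are ordinary user-defined Beam metrics declared inside DoFns. As a minimal illustration only (this is not the test's RowMonitor/TimeMonitor code; the class and counter name below are made up), each distinct namespace/name pair becomes one metric descriptor:

    import org.apache.beam.sdk.metrics.Counter;
    import org.apache.beam.sdk.metrics.Metrics;
    import org.apache.beam.sdk.transforms.DoFn;

    // Hypothetical DoFn: the counter below is one user-defined metric, and each
    // distinct (namespace, name) pair contributes one Stackdriver metric descriptor.
    class RowCountingFn extends DoFn<String, String> {
      private final Counter rowsSeen = Metrics.counter(RowCountingFn.class, "rows_seen");

      @ProcessElement
      public void process(ProcessContext c) {
        rowsSeen.inc();
        c.output(c.element());
      }
    }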
    Sep 12, 2020 12:46:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T00:46:24.546Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 12, 2020 12:46:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T00:46:45.513Z: Workers have started successfully.
    Sep 12, 2020 12:46:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T00:46:45.546Z: Workers have started successfully.
    Sep 12, 2020 12:47:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T00:47:20.484Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 12, 2020 12:47:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T00:47:20.644Z: Cleaning up.
    Sep 12, 2020 12:47:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T00:47:20.733Z: Stopping worker pool...
    Sep 12, 2020 12:48:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T00:48:13.271Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 12, 2020 12:48:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T00:48:13.320Z: Worker pool stopped.
    Sep 12, 2020 12:48:22 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-11_17_45_39-8814380002563364598 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 5a924cd9-1e61-435d-b54f-0b537a85a72a and timestamp: 2020-09-12T00:48:22.838000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    16.322

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 12, 2020 12:48:23 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 2 mins 57.425 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 6s
107 actionable tasks: 74 executed, 33 from cache

Publishing build scan...
https://gradle.com/s/ugihktdnc6lnc

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #983

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/983/display/redirect?page=changes>

Changes:

[noreply] [BEAM-10874] Support v2 Go protos. (#12816)

[noreply] [BEAM-10824] [BEAM-7654] Use deterministic hash functions in

[noreply] [BEAM-9561] Fix issue with pickling in doctests. (#12560)

[Alan Myrvold] [BEAM-10868] Fix build (docker run) concurrency issue when building

[noreply] [BEAM-10009] Add beam:logical_type:micros_instant:v1 (#12764)


------------------------------------------
[...truncated 287.15 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 11, 2020 6:45:30 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 11, 2020 6:45:30 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 11, 2020 6:45:31 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 11, 2020 6:45:31 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 11, 2020 6:45:31 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 11, 2020 6:45:31 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 11, 2020 6:45:31 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
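
The exception above already names the fix: give the Row PCollection a schema (or an explicit coder) so a coder can be inferred. A minimal sketch of that pattern, assuming a hand-built schema and DoFn rather than the test's actual RowMonitor transform:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      // Illustrative schema mirroring the projected columns in the query above.
      private static final Schema SCHEMA =
          Schema.builder()
              .addStringField("author")
              .addStringField("type")
              .addStringField("title")
              .addInt64Field("score")
              .build();

      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        PCollection<Row> rows =
            p.apply(Create.of("seed"))
                .apply(
                    "ProduceRows",
                    ParDo.of(
                        new DoFn<String, Row>() {
                          @ProcessElement
                          public void process(ProcessContext c) {
                            c.output(
                                Row.withSchema(SCHEMA)
                                    .addValues("someone", "story", "a title", 3L)
                                    .build());
                          }
                        }))
                // Without a schema (or an explicit setCoder), coder inference fails
                // with the IllegalStateException shown above.
                .setRowSchema(SCHEMA);

        p.run().waitUntilFinish();
      }
    }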

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 11, 2020 6:45:31 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 11, 2020 6:45:31 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 11, 2020 6:45:31 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 11, 2020 6:45:31 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 11, 2020 6:45:31 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 11, 2020 6:45:31 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 11, 2020 6:45:31 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 11, 2020 6:45:31 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
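
For context, the predicate being pushed down is ordinary Beam SQL, and the same filter can be run against an in-memory PCollection<Row> with SqlTransform. A small sketch under that assumption (the rows and schema here are invented; the integration test itself queries the real BigQuery HACKER_NEWS table):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class PushDownFilterSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Illustrative schema with the four columns used by the query.
        Schema schema =
            Schema.builder()
                .addStringField("by")
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();

        PCollection<Row> hackerNews =
            p.apply(
                Create.of(
                        Row.withSchema(schema).addValues("alice", "story", "kept", 5L).build(),
                        Row.withSchema(schema).addValues("bob", "comment", "dropped", 9L).build())
                    .withRowSchema(schema));

        // PCOLLECTION is the implicit table name for the transform's input.
        PCollection<Row> filtered =
            hackerNews.apply(
                SqlTransform.query(
                    "SELECT `by` AS author, `type`, `title`, `score` FROM PCOLLECTION "
                        + "WHERE (`type` = 'story' OR `type` = 'job') AND `score` > 2"));

        p.run().waitUntilFinish();
      }
    }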
    Sep 11, 2020 6:45:32 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 11, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 11, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-gMchEEhbxaRrNf1P5xcPsICu4oMIsJ-XW2cmgKwFatc.jar
    Sep 11, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-eBadOFA0fqv8G9L2Wd3Um477d0ki09Ob4UJqszX16OQ.jar
    Sep 11, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-gMchEEhbxaRrNf1P5xcPsICu4oMIsJ-XW2cmgKwFatc.jar
    Sep 11, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-t2eNpXQCgZwI5KgYP5VZf3jqeaBJjebeqfwEpjC0-jY.jar
    Sep 11, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-4AO1reP-sV1oEAvNZb758j2Uhv9BqgRWjDCZNEgm3lA.jar
    Sep 11, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-Q6quusJltf6lfv5SsIXp0JCi1Sgq-MTAeQfqlrdsaYs.jar
    Sep 11, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-peZ5SYTL3Hqnq6vbGTtTswa0JiWBPJyU0FH_zPWcwq8.jar
    Sep 11, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-o4iBu5jmn2LrPwi_ds7QAXQ-pyvqAMd-BZWJ72s9u0k.jar
    Sep 11, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-fJSqknmdlL2NoQzJjJbaXLf7IeBLX9wg4vPX9PQjM2U.jar
    Sep 11, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-W341Jh_3oED0XPXotfCXz2afWgRCJNAYLdvMEtt7WyA.jar
    Sep 11, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-9klyhcYbdUxdyuDJqnl9gcRjoINslbLG1r4KnXqvCno.jar
    Sep 11, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8867722970680486480.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-7L3fLHt-HsqBCCDsrWHqFbTpom5tkfMcnG05HXfv_9A.jar
    Sep 11, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-k1ojP9Y9cNG3_qpXqEftM3hDyX6RDoXII_ebD-UO790.jar
    Sep 11, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-vHM22C2ymmgYRo-CAFvdehkmkbG3I12as-CfyYuKobg.jar
    Sep 11, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-uahFEsj4yV8_ydh1Cee6DLiXAJ7CxtvsLjfLjwSFofg.jar
    Sep 11, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-5b9QI-W_e5BUDfLTwbh_64SDt5e_cXZfe-S9Pm6tRQg.jar
    Sep 11, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-g-JggqFeTaBMk9G6BOirjczISokGKmUX6Tpt35ACpXM.jar
    Sep 11, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-gLY7oc72_7_e5LRSMleAIVz_bSsa2IkMkxjiDdWciMs.jar
    Sep 11, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-M_A8Uc7DoqhCX1DswcSeVWLOdoVZuCNKuW6dEJvjwS0.jar
    Sep 11, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-iS3WTp25igCIdylCP65sZBnSutjSymp8YHoJN2k-1mk.jar
    Sep 11, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-CwlG70WDekWDWbbxkYBZ0rP_ZNFZN12C1VoDwkjh--Y.jar
    Sep 11, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-d5RbX3wLHAUPtPoCPgpISx8MMy5RVnWPR4XnqQDoKnM.jar
    Sep 11, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-8cn525-een-8dZi9RpwqimzKZlmsGH411tQ1zTZ0cA4.jar
    Sep 11, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-gg6Iy6lgyZu9L96cNue42RURw2ZUZgRBa_wA515Kpeg.jar
    Sep 11, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-GeO4IwTNrKdurrbYGb1rK1ODkEnDl77ZLygEVYK4qhQ.jar
    Sep 11, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-QVlKnfLaL6006khliamBpKj2KquSuLttRKvC4bUc9T0.jar
    Sep 11, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-yvK_oVVhcbLLQLH2R2tP_8SaMZ8WJdEFhc8MHpgekRY.jar
    Sep 11, 2020 6:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-IOsh7pW4-j0MYApu8-RZVQ9wzIMv7HYIPrw_1NgLOso.jar
    Sep 11, 2020 6:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-_6JpkNEB7vq0PjxapkCYFRoQohrt7PfzoSI39QXu1xU.jar
    Sep 11, 2020 6:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-Ixbo4tTmPCbm93rknuAmlW9wm20f12iYregH16UFDY4.jar
    Sep 11, 2020 6:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-V80DpgZ5XK-361QcHpIMn6ZQ0TegcVhn-kYG1H3EA9g.jar
    Sep 11, 2020 6:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 1 seconds
    Sep 11, 2020 6:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 11, 2020 6:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 11, 2020 6:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 11, 2020 6:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 11, 2020 6:45:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 11, 2020 6:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92071 bytes, hash 24c825147da13f35d03c06abe778da2e646a36d43a6e2bf4065954e68937694e> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-JMglFH2hPzXQPAar53jaLmRqNtQ6biv0BllU5ok3aU4.pb
    Sep 11, 2020 6:45:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 11, 2020 6:45:37 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-11_11_45_36-17339166121582903736?project=apache-beam-testing
    Sep 11, 2020 6:45:37 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-11_11_45_36-17339166121582903736
    Sep 11, 2020 6:45:37 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-11_11_45_36-17339166121582903736
    Sep 11, 2020 6:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-11T18:45:36.344Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 11, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T18:45:43.383Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 11, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T18:45:44.153Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 11, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T18:45:44.196Z: Expanding GroupByKey operations into optimizable parts.
    Sep 11, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T18:45:44.237Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 11, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T18:45:44.314Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 11, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T18:45:44.354Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 11, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T18:45:44.389Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 11, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T18:45:44.423Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 11, 2020 6:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T18:45:44.852Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 11, 2020 6:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T18:45:45.039Z: Starting 5 workers in us-central1-b...
    Sep 11, 2020 6:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T18:46:16.978Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 11, 2020 6:46:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-11T18:46:20.825Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 11, 2020 6:46:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T18:46:39.841Z: Workers have started successfully.
    Sep 11, 2020 6:46:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T18:46:39.876Z: Workers have started successfully.
    Sep 11, 2020 6:47:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T18:47:08.746Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 11, 2020 6:47:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T18:47:08.901Z: Cleaning up.
    Sep 11, 2020 6:47:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T18:47:08.964Z: Stopping worker pool...
    Sep 11, 2020 6:48:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T18:47:59.014Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 11, 2020 6:48:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T18:47:59.057Z: Worker pool stopped.
    Sep 11, 2020 6:48:07 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-11_11_45_36-17339166121582903736 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 16ca54f6-27fe-44fd-8fea-e1a574fc4229 and timestamp: 2020-09-11T18:48:07.108000000Z:
                     Metric:                    Value:
                   read_time                    11.094
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 11, 2020 6:48:07 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 3 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 7,5,main]) completed. Took 2 mins 44.079 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 50s
107 actionable tasks: 77 executed, 30 from cache

Publishing build scan...
https://gradle.com/s/ezh4jsl4ksqh4

Stopped 2 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #982

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/982/display/redirect?page=changes>

Changes:

[nosacky] Update Gradle Wrapper

[noreply] Merge pull request #12435: [BEAM-10616] Added Python Pardo load tests


------------------------------------------
[...truncated 280.27 KB...]
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 11, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 11, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 11, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 11, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 11, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 11, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 11, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 11, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 11, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 11, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 11, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 11, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 11, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 11, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 11, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 11, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 11, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 11, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-Ae-f9Sheu2wbkzVMjhU3yuMeD46q9B4AN572q3FqU3Q.jar
    Sep 11, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-RrpzuKjX6tdnEEgoeH-Y52Xbsx-g9VWh8VwSDS1G8nE.jar
    Sep 11, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-Bqz0OrBTg1-gzj5rOdsvKvnbLIP3vFQZlqKf_aYtCEQ.jar
    Sep 11, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-eoARtvQ8xD38IYRnoQBQHk8gf9k7-LxlBbVK_qALrA4.jar
    Sep 11, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-i49DbrbEN_EF0EJNQGKAEkAP-2DgEBbBYFvGFhnuGmc.jar
    Sep 11, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-iRfh7N3OmNuIscFuMVAo2XpwtYnbhkpWs_0lFeFzVa0.jar
    Sep 11, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-qdrya5fq-KK9J1c0WClyeFwpFh5kbWPpxMpZsRx9MeU.jar
    Sep 11, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-18vI8T7FjKxmh898q8-a19MGtx-Mb9h6dgQPbGFfTYQ.jar
    Sep 11, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-SYDCsTRSert-DvrPCndDb-DBqRJy2H1nEZ6JoZ7ck04.jar
    Sep 11, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-JtWLeDuaTM_Xyx3rJIDqNTXdFBwta1DC1AjUAjX3uzM.jar
    Sep 11, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-6c7D4pNIrSoBOe-vtmBA8XJa2yK0z6A8dniKh06JpTs.jar
    Sep 11, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-IHIYYVFU0mFc46CJId0kRLNSlOAzqd3sXyHpKxKY7OA.jar
    Sep 11, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-5icT37eqIee1jpf9FJtDGlCIu9kupnL7Yh7wt0UAWJc.jar
    Sep 11, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-ZnDzBdSsnCzj6S0ZGkgpfLO7jplMuJcfo8ASiwHiRaE.jar
    Sep 11, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-Ae-f9Sheu2wbkzVMjhU3yuMeD46q9B4AN572q3FqU3Q.jar
    Sep 11, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test96809853339896008.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-awOOZ3_qju91NsiTwp5S6jydrmnTI8xkd0E-ROEMQdk.jar
    Sep 11, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-hZGsZpUChD-CIcnQb1C5bI2puzzSbFPl-oxm5kbm3-A.jar
    Sep 11, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-U_764p8r1QpTuCFYfY-lEtxl9ZJSjXSViVgqhb3j-0E.jar
    Sep 11, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-5VSasBxezAIjnogLb9FGuWNLDrb1RhgpybsaaaaiEoU.jar
    Sep 11, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-Hp4aPqnUT0FmswW5kyNUbhCSpw_-6GruqfvrZ3F1zAI.jar
    Sep 11, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests--74lxoyOIQi0xm0vs8Mqqna1RZRCRAY4AMkek0sNQe8.jar
    Sep 11, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-fG-93-yQPPNxI__sZBQWo6oIGM2hnAJ-yDs17nH4WUE.jar
    Sep 11, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-_ONWD1WHrJGCa8v-wOw6rDV5gyfCEUjIG9tcN-iyStI.jar
    Sep 11, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-WvxaJUzkqosZB2Y07f1-xRxpoafAG9-A7ZETM0fJ9tM.jar
    Sep 11, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-I5GaLJpazUJNu6niTzSWPKkGLOctdKMAj5AtQfQJMrc.jar
    Sep 11, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-sTKuC6BPGUoPoyHrvkMeUfBIeX8ldpUVRQRClk5K0T0.jar
    Sep 11, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-VxCGu6AS8WR3MJNSwvredsACzjgdFr7b_DBu5PrhCL8.jar
    Sep 11, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-gQSSwfJC8LnNIZYVahNDbAb6k8MUxy7p_GYamL1D_94.jar
    Sep 11, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-FBptW6-9s0jhAjlmzeUzQZQ8sxKAz6ZQpMnqakAZmig.jar
    Sep 11, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-xhtyF5wNVQbolMX5Ffp-wnPRKQVOjeHBt5d7r2rBTTM.jar
    Sep 11, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-LNGEc6ISGTzpKpyUMekIeV4v2zcZ7272xIymG_DX_0w.jar
    Sep 11, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 1 seconds
    Sep 11, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 11, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 11, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 11, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 11, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 11, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92069 bytes, hash 885bf7ae048fdf391c390a6e1c3eb413159eb05f7184af122967ae716024eb0f> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-iFv3rgSP3zkcOQpuHD60ExWesF9xhK8SKWeucWAk6w8.pb
    Sep 11, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 11, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-11_05_45_22-4646391347541778270?project=apache-beam-testing
    Sep 11, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-11_05_45_22-4646391347541778270
    Sep 11, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-11_05_45_22-4646391347541778270
    Sep 11, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-11T12:45:22.136Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 11, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T12:45:28.991Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 11, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T12:45:29.740Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 11, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T12:45:29.769Z: Expanding GroupByKey operations into optimizable parts.
    Sep 11, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T12:45:29.797Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 11, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T12:45:29.877Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 11, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T12:45:29.916Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 11, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T12:45:29.945Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 11, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T12:45:30.091Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 11, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T12:45:30.444Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 11, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T12:45:30.534Z: Starting 5 workers in us-central1-b...
    Sep 11, 2020 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-11T12:45:37.267Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
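
The warning above does not affect the test outcome, but it explains why no new Dataflow custom metrics will appear in Stackdriver for this job: the project already holds 100 Dataflow-created metric descriptors. The log points at the raw Monitoring REST methods for listing and deleting stale descriptors; the sketch below shows the same clean-up through the google-cloud-monitoring Java client, purely as an illustration. That client is not a dependency of this build, and the custom.googleapis.com type prefix used for filtering is an assumption, so review the matches before deleting anything.

    import com.google.api.MetricDescriptor;
    import com.google.cloud.monitoring.v3.MetricServiceClient;
    import com.google.monitoring.v3.ProjectName;

    public class CleanUpOldMetricDescriptors {
      public static void main(String[] args) throws Exception {
        ProjectName project = ProjectName.of("apache-beam-testing");
        try (MetricServiceClient client = MetricServiceClient.create()) {
          for (MetricDescriptor descriptor : client.listMetricDescriptors(project).iterateAll()) {
            // Assumption: user-defined Dataflow metrics are registered under the
            // custom.googleapis.com prefix; tighten this filter to the descriptors you
            // actually consider stale.
            if (descriptor.getType().startsWith("custom.googleapis.com/")) {
              System.out.println("Candidate for deletion: " + descriptor.getName());
              // Uncomment to actually delete (irreversible):
              // client.deleteMetricDescriptor(descriptor.getName());
            }
          }
        }
      }
    }
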
    Sep 11, 2020 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T12:46:00.777Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Sep 11, 2020 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T12:46:00.807Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Sep 11, 2020 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T12:46:06.128Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 11, 2020 12:46:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T12:46:18.999Z: Workers have started successfully.
    Sep 11, 2020 12:46:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T12:46:19.035Z: Workers have started successfully.
    Sep 11, 2020 12:46:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T12:46:50.387Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 11, 2020 12:46:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T12:46:50.566Z: Cleaning up.
    Sep 11, 2020 12:46:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T12:46:50.849Z: Stopping worker pool...
    Sep 11, 2020 12:47:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T12:47:42.221Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 11, 2020 12:47:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T12:47:42.256Z: Worker pool stopped.
    Sep 11, 2020 12:47:51 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-11_05_45_22-4646391347541778270 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 7725d500-0797-4593-b222-85df5960dc65 and timestamp: 2020-09-11T12:47:51.387000000Z:
                     Metric:                    Value:
                   read_time                    12.158
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 11, 2020 12:47:51 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
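
This warning only affects metrics publishing, not the test result: because the measurement and database properties were not supplied, the read_time and fields_read values above never reach InfluxDB. A minimal sketch of supplying them is below. The builder methods and every value shown are assumptions about how the settings consumed by InfluxDBPublisher are typically built in Beam performance tests; none of them come from this job's configuration.

    import org.apache.beam.sdk.testutils.publishing.InfluxDBSettings;

    public class InfluxSettingsSketch {
      public static void main(String[] args) {
        // Placeholder host, database, and measurement names -- the point is only that both
        // the database and the measurement must be present for metrics to be published.
        InfluxDBSettings settings =
            InfluxDBSettings.builder()
                .withHost("http://localhost:8086")
                .withDatabase("beam_test_metrics")
                .withMeasurement("sql_bqio_read_java_batch")
                .get();
        // 'settings' would then be handed to the publisher by the test harness.
      }
    }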

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.064 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.037 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 8,5,main]) completed. Took 2 mins 43.268 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 34s
107 actionable tasks: 71 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/p4pqkcuacvoq4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #981

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/981/display/redirect?page=changes>

Changes:

[noreply] [Beam-9543] support MATCH_RECOGNIZE with NFA (#12532)


------------------------------------------
[...truncated 282.24 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 11, 2020 6:45:30 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 11, 2020 6:45:30 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 11, 2020 6:45:30 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 11, 2020 6:45:30 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 11, 2020 6:45:30 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 11, 2020 6:45:30 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 11, 2020 6:45:30 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
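
This failure looks like a deterministic pipeline-construction error rather than an infrastructure flake, and the exception spells out the remedy: the Row-typed output of ParDo(RowMonitor) has no schema attached, so no coder can be inferred for it. The sketch below reproduces the situation in miniature and applies the fix the message suggests, PCollection.setRowSchema. The schema, values, and stand-in DoFn are illustrative placeholders, not code from BigQueryIOPushDownIT.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Placeholder schema roughly mirroring the projected HACKER_NEWS columns.
        Schema schema =
            Schema.builder()
                .addNullableField("author", Schema.FieldType.STRING)
                .addNullableField("type", Schema.FieldType.STRING)
                .addNullableField("title", Schema.FieldType.STRING)
                .addInt64Field("score")
                .build();
        Row row = Row.withSchema(schema).addValues("someone", "story", "a title", 3L).build();

        PCollection<Row> rows =
            p.apply(Create.of(row).withCoder(RowCoder.of(schema)))
                // Stand-in for ParDo(RowMonitor): a Row-in/Row-out ParDo whose output coder
                // cannot be inferred automatically.
                .apply(
                    "RowMonitorStandIn",
                    ParDo.of(
                        new DoFn<Row, Row>() {
                          @ProcessElement
                          public void processElement(@Element Row r, OutputReceiver<Row> out) {
                            out.output(r);
                          }
                        }))
                // Without this line, construction fails with the same "Unable to return a
                // default Coder ... setRowSchema" error as above.
                .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }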

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 11, 2020 6:45:30 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 11, 2020 6:45:30 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 11, 2020 6:45:30 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 11, 2020 6:45:30 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 11, 2020 6:45:30 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 11, 2020 6:45:30 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 11, 2020 6:45:30 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 11, 2020 6:45:30 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
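
The BEAMPlan and the "Pushing down the following filter" line above are the heart of the one passing test: only the four referenced fields are requested, and the whole WHERE clause is handed to the BigQuery Storage API instead of being evaluated in the pipeline. For readers who want the same behaviour without going through Beam SQL, the sketch below shows roughly how that pushed-down read maps onto BigQueryIO directly; the table reference is a placeholder and the snippet is an illustration, not code from the test.

    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        p.apply(
            "Read BQ rows with push-down",
            BigQueryIO.readTableRows()
                // Placeholder table spec, not the IT's actual HACKER_NEWS table.
                .from("my-project:my_dataset.hacker_news")
                // DIRECT_READ selects the BigQuery Storage API, the precondition for push-down.
                .withMethod(TypedRead.Method.DIRECT_READ)
                // Counterpart of usedFields=[[by, type, title, score]] in the plan above.
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                // Counterpart of the filter logged as pushed down above.
                .withRowRestriction("(`type` = 'story' OR `type` = 'job') AND `score` > 2"));

        p.run();
      }
    }
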
    Sep 11, 2020 6:45:31 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 11, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 11, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-_9Cfd3ryUawBsh3msigNEAa8Cvi10V8wJrscUSfwYQk.jar
    Sep 11, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-grRhjzORfFSbq5pz6zhNt5b2zFoC5xK0P25yj88INi4.jar
    Sep 11, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests--B1zo9QNCliMwvbRjzBWmZRL2WaCrLNdpNruYKFDW0U.jar
    Sep 11, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-qFu9OuvheSS45-P5pFZDFRzsUMxjS4jsPVT6h9IMwRI.jar
    Sep 11, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-Xo8iPnbB0F48b7d8k5TCJY0LYhj1tkpjb9MQIrWmDo8.jar
    Sep 11, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-MMZ7bA7QAggCus1hdIoSgun4WOFH_aVMR_nomKnRMzA.jar
    Sep 11, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8133321743153480524.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-eWCyglhyGERIMtTGYsGbjobXe4PuGxONbUcBkjmTxWk.jar
    Sep 11, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-BQMhGqNFQSEIZyOtbCLr8FFLErGS8N8h2RBbpS7wUkU.jar
    Sep 11, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-qSUTeufk8e_I0zfQt6OIgg7OyRNGmQXy4rVMCDPdE8E.jar
    Sep 11, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-g1FKzvEu1xwmB9oXVghpNw1dszE7Am7GaDmTn71Wdks.jar
    Sep 11, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-rVcQRS8_RbiOZDyRiGzdH8WyH24-Q3aJsSXjJsLr69M.jar
    Sep 11, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-aJvON1v8KfCMbwy14yBZocMEIYGJTrQRaRRgr9V4KoI.jar
    Sep 11, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-PqZ8lMBtgykrlw5u32ERpD-IuuE-7AqAtseeJFycbuY.jar
    Sep 11, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-B1vLv0BHOGojq8Q-aqTl8_YvOvT_uyq3f-BxH0qkUNA.jar
    Sep 11, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-voDwHFf0uMJRnjtL6iubdx6UEiFTHsAeGGDS17CAbS4.jar
    Sep 11, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-QWSFan_77uW0nUqBzH1nE4KidAvITMozko0nFMPeEF8.jar
    Sep 11, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-_9Cfd3ryUawBsh3msigNEAa8Cvi10V8wJrscUSfwYQk.jar
    Sep 11, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-e7OaK-HFcOtYzmZASvl7F31XOyHc7Vqdv_qEya_MJoU.jar
    Sep 11, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-7r30WNhrp1uxUAHK_8FKQvcTpO-JHVePCz3526fhBB0.jar
    Sep 11, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-J6l-1m96I4bW7SucJlEGKAgkfaEFxzELKLtty4jOpuo.jar
    Sep 11, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-0nS4qFQUaurprUnpqG_t6WheA5bwqMU9SOFtdmBmX_M.jar
    Sep 11, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-k1AjSlcJdAOSCU_fcTWY9DGSGOTZYFrD7qI064FaVnc.jar
    Sep 11, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-kI_qa3dHFGQ8tV97GravtY94gyuZoseTDFlQXK28f4Y.jar
    Sep 11, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-1OoNaZIcuiXvHbMTrSPM3s2vdPdGBEyzV5zBbYKVRMY.jar
    Sep 11, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-CNlg06FxFftf-dth9vVu2hLm3ip1JbTO0Z7jCosjTHc.jar
    Sep 11, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-UFA5E5gbUTTAXExYB45VUPpPE1lLdrHkQZDQ5yKqTsw.jar
    Sep 11, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-jk9kJjTYu-BV-iwb-DGQ9lZy5LSsqyA4vbTHHPIsJvY.jar
    Sep 11, 2020 6:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-MDP9QiMokdVITk3kiybeLrH9w8jirlF_JzZXe89Sg_k.jar
    Sep 11, 2020 6:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-glqEdPkCBXH7kz2UEFK_58-RFi2Thk5WLfoOhLRML98.jar
    Sep 11, 2020 6:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-56cOC1vWx9OMJ9lnBwTIqk3Es9SfdCF71iEKSWpUYyg.jar
    Sep 11, 2020 6:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-W1nt0iES5_HP-DyNhfW_FcLK0iDEy0mEBbVPCK45ccY.jar
    Sep 11, 2020 6:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 0 seconds
    Sep 11, 2020 6:45:34 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 11, 2020 6:45:34 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 11, 2020 6:45:34 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 11, 2020 6:45:34 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 11, 2020 6:45:34 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 11, 2020 6:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92071 bytes, hash df41dfea354c41904b5d163e0d7232825928225c7179d2ed2fabf9a25bfbc645> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-30Hf6jVMQZBLXRY-DXIyglkoIlxxedLtL6v5olv7xkU.pb
    Sep 11, 2020 6:45:35 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 11, 2020 6:45:36 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-10_23_45_35-7573820781821908256?project=apache-beam-testing
    Sep 11, 2020 6:45:36 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-10_23_45_35-7573820781821908256
    Sep 11, 2020 6:45:36 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-10_23_45_35-7573820781821908256
    Sep 11, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-11T06:45:35.112Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 11, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T06:45:42.759Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 11, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T06:45:43.639Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 11, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T06:45:43.671Z: Expanding GroupByKey operations into optimizable parts.
    Sep 11, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T06:45:43.834Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 11, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T06:45:43.904Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 11, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T06:45:43.935Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 11, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T06:45:43.965Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 11, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T06:45:43.997Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 11, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T06:45:44.460Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 11, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T06:45:44.540Z: Starting 5 workers in us-central1-b...
    Sep 11, 2020 6:46:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T06:46:16.785Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 11, 2020 6:46:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-11T06:46:17.899Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 11, 2020 6:46:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T06:46:41.289Z: Workers have started successfully.
    Sep 11, 2020 6:46:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T06:46:41.318Z: Workers have started successfully.
    Sep 11, 2020 6:47:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T06:47:11.874Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 11, 2020 6:47:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T06:47:12.029Z: Cleaning up.
    Sep 11, 2020 6:47:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T06:47:12.116Z: Stopping worker pool...
    Sep 11, 2020 6:47:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T06:47:53.710Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 11, 2020 6:47:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T06:47:53.843Z: Worker pool stopped.
    Sep 11, 2020 6:48:02 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-10_23_45_35-7573820781821908256 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): a317ed69-a16d-4d36-becd-a847dbd0b38d and timestamp: 2020-09-11T06:48:02.281000000Z:
                     Metric:                    Value:
                   read_time                    12.324
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 11, 2020 6:48:02 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.017 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 2 mins 40.082 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 47s
107 actionable tasks: 74 executed, 33 from cache

Publishing build scan...
https://gradle.com/s/zws2rkqjjchqi

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #980

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/980/display/redirect?page=changes>

Changes:

[Udi Meiri] [BEAM-10701] Fix Python coverage reporting

[Kyle Weaver] [BEAM-10762] Fix artifact staging bug in Flink/Spark uber jar runners.


------------------------------------------
[...truncated 281.52 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 11, 2020 12:45:28 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 11, 2020 12:45:28 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 11, 2020 12:45:28 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 11, 2020 12:45:28 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 11, 2020 12:45:28 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 11, 2020 12:45:28 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 11, 2020 12:45:28 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 11, 2020 12:45:28 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 11, 2020 12:45:28 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 11, 2020 12:45:28 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 11, 2020 12:45:28 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 11, 2020 12:45:28 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 11, 2020 12:45:28 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 11, 2020 12:45:28 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 11, 2020 12:45:28 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 11, 2020 12:45:29 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 11, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 11, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-ieC58hUm98_y-GGagrXdEkZqtvQ2l5YRZRLzJCrmEQE.jar
    Sep 11, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-yNiqRMgHr6NJXv_AVCStjnG2DQz9nTOWMVgwAh4bEd4.jar
    Sep 11, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-MwSOUj-MyNtQWa-5aX0riw_6W1QdXWnC4CEDKD4ATfs.jar
    Sep 11, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-hFo60eUWGUA-bJYW_peVkOhDLNS3-p0RnHX8OLZlzKg.jar
    Sep 11, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-F4xeCjVy7_-EVdlzZ7E9NfbBStEpdqBQY9NfwaoAr9w.jar
    Sep 11, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-XWvWQjYfRx9hpeQq3OLzXMxFqEna7qj5yn8PgPceYXk.jar
    Sep 11, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-swXMvVBmRN5BhKpVo2PipiNsvEtJCdfzQZ9NdiuXMZE.jar
    Sep 11, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-O_LNUBkpcdJR4kPf4jmFBQTg9hAVP7FcSfEGbZhnCaU.jar
    Sep 11, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-cOzH1nlhBvi_4eJGfWdP3OIWpAzJBorGXRzpHJm59z8.jar
    Sep 11, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-2gV0dJrWb0Vjd0fall-eH5VnwRbOtEDyprMPf4dLg7c.jar
    Sep 11, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4667017373377776618.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-KZTFHs5qPcwf4iNogdV2cDlXWv6eicRxr_KM8KxFEsg.jar
    Sep 11, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-wMxMW2gtNf-YiJLjoRnbobjDnrJJjivBF6xvFECPxd8.jar
    Sep 11, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-2E95Q_9wp5aRg4K7l92AnMVk_pxqhi0BEUqr_MAdvBU.jar
    Sep 11, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-ieC58hUm98_y-GGagrXdEkZqtvQ2l5YRZRLzJCrmEQE.jar
    Sep 11, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-eUtrGBQNP1KLDz9z0O3L0PYQCWaKv28cQLf6MuXETJE.jar
    Sep 11, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-HHYUr_S25aLAbLc9MfF1RENP1rUfZ4j0ne9pwqUaghM.jar
    Sep 11, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-wvnnRz1CeTGtTHOJN0PafoY5P3kiE3gxpDukjIYZ_ts.jar
    Sep 11, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-2Yxux775geaF8obh0MKUaRGlds8vwUbkqEUYqpVU5P0.jar
    Sep 11, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-BYxr2siFjGR85CyNdPlR7nOtu2B_a8HzM-D8y56vqr8.jar
    Sep 11, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-pktgizmo61FVknEbPbmt_E7NHdKoEPmpkJ8z4UUkZ_c.jar
    Sep 11, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-8PX9jEMWSTaJRHTSAdbMiW21wQEBDXubTB1MJG8WNps.jar
    Sep 11, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-TSNfr4W1eOjvUri3qrIjTZZ167t4qz3N2NEW2A5kQz4.jar
    Sep 11, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-Gm5KVx8ok8uraHFXhVclw1cWP-yKKUnhkdzZfHT6bYQ.jar
    Sep 11, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-JPL5EzLUoK4qUZBSPW2agNWlMCcB-Q6_J7JLco5PzzM.jar
    Sep 11, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-728kO6QeW-vGeZ6lDMCTbIFNlgA5TldktwRhMv7bjDA.jar
    Sep 11, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-V5ouJnbHDDwOraZG_5sxKo4XmE46gpNetsSsgTPFYCY.jar
    Sep 11, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-367uDZRB-uVwnqegAZpwBS-vnQLcAQ6Xerne7kdBMNE.jar
    Sep 11, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-IxjW3mW5s8RqCzUwpcbuZdDj4UUoLDG9bHD2WB9FgdM.jar
    Sep 11, 2020 12:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-M7qFJwEJMizpw1K05G4xxMlimiXhKp9KjfBdgscvn5c.jar
    Sep 11, 2020 12:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-n0qyaDhmefLsQ4aPvfbvdqTQjE-N0UDMO1mCqDMr7Uk.jar
    Sep 11, 2020 12:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-MGBYUlK7mDrI7CKxMqI2iAoQhZ2vAZAPvQV-JCNsGDI.jar
    Sep 11, 2020 12:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 1 seconds
    Sep 11, 2020 12:45:32 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 11, 2020 12:45:32 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 11, 2020 12:45:32 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 11, 2020 12:45:32 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 11, 2020 12:45:32 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 11, 2020 12:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92071 bytes, hash c3e0b58032478fda0d5d96aab7010c2192346a57bd0305fcd2e2712d2690e043> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-w-C1gDJHj9oNXZaqtwEMIZI0ale9AwX80uJxLSaQ4EM.pb
    Sep 11, 2020 12:45:33 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 11, 2020 12:45:34 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-10_17_45_33-5855299628727253134?project=apache-beam-testing
    Sep 11, 2020 12:45:34 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-10_17_45_33-5855299628727253134
    Sep 11, 2020 12:45:34 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-10_17_45_33-5855299628727253134
    Sep 11, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-11T00:45:33.231Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 11, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T00:45:41.150Z: Worker configuration: n1-standard-1 in us-central1-a.
    Sep 11, 2020 12:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T00:45:42.268Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 11, 2020 12:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T00:45:42.295Z: Expanding GroupByKey operations into optimizable parts.
    Sep 11, 2020 12:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T00:45:42.321Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 11, 2020 12:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T00:45:42.398Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 11, 2020 12:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T00:45:42.428Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 11, 2020 12:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T00:45:42.464Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 11, 2020 12:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T00:45:42.498Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 11, 2020 12:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T00:45:43.092Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 11, 2020 12:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T00:45:43.182Z: Starting 5 workers in us-central1-a...
    Sep 11, 2020 12:46:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-11T00:46:10.074Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 11, 2020 12:46:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T00:46:11.170Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 11, 2020 12:46:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T00:46:26.384Z: Workers have started successfully.
    Sep 11, 2020 12:46:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T00:46:26.418Z: Workers have started successfully.
    Sep 11, 2020 12:46:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T00:46:58.398Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 11, 2020 12:46:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T00:46:58.540Z: Cleaning up.
    Sep 11, 2020 12:46:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T00:46:58.632Z: Stopping worker pool...
    Sep 11, 2020 12:47:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T00:47:47.683Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 11, 2020 12:47:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T00:47:47.723Z: Worker pool stopped.
    Sep 11, 2020 12:48:04 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-10_17_45_33-5855299628727253134 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 771ca346-c817-41c0-a46f-1b27e15250a3 and timestamp: 2020-09-11T00:48:04.489000000Z:
                     Metric:                    Value:
                   read_time                    13.742
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 11, 2020 12:48:05 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 7,5,main]) completed. Took 2 mins 44.035 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 47s
107 actionable tasks: 73 executed, 34 from cache

Publishing build scan...
https://gradle.com/s/frxymp36dbfhe

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #979

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/979/display/redirect?page=changes>

Changes:

[txian] Upgrade cloud spanner version to 1.59.0

[txian] Upgrade gax_version to 1.57.1

[txian] Upgrade google_cloud_core_version and google_cloud_bigtable_client_core

[txian] Bump gax_httpjson to 0.74.1

[txian] Bump com.google.cloud:google-cloud-core-http to 1.93.7

[txian] Revert version bump of google-cloud-core-http

[txian] Revert "Bump gax_httpjson to 0.74.1"

[txian] Update bigtable-client-core to 1.14.1 (released version)

[txian] Update bigtable-client-core to 1.15.0

[txian] Bump google_cloud_bigtable_client_core to 1.15.1 (local build)

[txian] Add local maven repo

[txian] Revert "Add local maven repo"

[txian] Add local maven repo again

[txian] Update bigtable-client-core to 1.15.1-SNAPSHOT

[txian] Update bigtable-client-core to 1.16.0,

[txian] Revert change of adding mavenLocal

[txian] Fix test failure of GcpApiSurfaceTest

[txian] Specifically call out exposed classes (instead of an entire package) in

[txian] Revert change of SpannerSchema and will make it a separate PR.

[txian] Add a separate blank line.

[kamil.galuszka] [BEAM-9456] Upgrade Gradle to 6.6.1 (smaller version of #12568)

[txian] Add TODO in GcpApiSurfaceTest: TODO: remove newly-exposed classes once

[Robert Bradshaw] [BEAM-8893] Fix issues with state and multiple workers.

[aromanenko.dev] Update doc URLs for RabbitMqIO and KuduIO

[srohde] Fix data races in BCJ and RecordingManager

[noreply] [BEAM-5757] Updates CHANGES.md for ElasticsearchIO delete function

[noreply] [BEAM-10701] Change coveralls badge to codecov (#12768)


------------------------------------------
[...truncated 293.02 KB...]
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 10, 2020 6:45:56 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 10, 2020 6:45:56 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 10, 2020 6:45:57 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 10, 2020 6:45:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 10, 2020 6:45:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 10, 2020 6:45:57 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 10, 2020 6:45:57 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
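
Note: the exception message above already names the fix: the Row-typed output of ParDo(RowMonitor) needs an explicit schema (or coder) before the pipeline is finalized. A minimal, self-contained sketch of that pattern follows; the schema fields, the PassThroughFn stand-in for RowMonitor, and the sample values are illustrative assumptions, not the integration test's actual code.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      // Hypothetical schema mirroring the four columns projected by the query above.
      static final Schema SCHEMA =
          Schema.builder()
              .addStringField("author")
              .addStringField("type")
              .addStringField("title")
              .addInt64Field("score")
              .build();

      // Hypothetical pass-through DoFn standing in for ParDo(RowMonitor).
      static class PassThroughFn extends DoFn<Row, Row> {
        @ProcessElement
        public void process(@Element Row row, OutputReceiver<Row> out) {
          out.output(row);
        }
      }

      public static void main(String[] args) {
        Pipeline p = Pipeline.create();
        PCollection<Row> rows =
            p.apply(
                Create.of(Row.withSchema(SCHEMA).addValues("alice", "story", "hello", 3L).build())
                    .withRowSchema(SCHEMA));
        rows.apply(ParDo.of(new PassThroughFn()))
            // Without setRowSchema (or an explicit setCoder), coder inference for the
            // Row output fails with the IllegalStateException reported above.
            .setRowSchema(SCHEMA);
        p.run().waitUntilFinish();
      }
    }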

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 10, 2020 6:45:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 10, 2020 6:45:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 10, 2020 6:45:57 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 10, 2020 6:45:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 10, 2020 6:45:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 10, 2020 6:45:57 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 10, 2020 6:45:57 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 10, 2020 6:45:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
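
Note: with DIRECT_READ the planner trims the read to the used fields and hands the supported predicate to the BigQuery Storage API, as the two log lines above show. A hedged sketch of what that roughly amounts to in plain BigQueryIO terms; the class name and table reference are assumptions, and this is not the integration test's actual code.

    import java.util.Arrays;
    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.values.PCollection;

    public class DirectReadSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();
        PCollection<TableRow> rows =
            p.apply(
                BigQueryIO.readTableRows()
                    // Hypothetical table reference standing in for the HACKER_NEWS table above.
                    .from("some-project:some_dataset.hacker_news")
                    .withMethod(BigQueryIO.TypedRead.Method.DIRECT_READ)
                    // Only the columns the query uses are requested from the Storage API...
                    .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                    // ...and the supported part of the WHERE clause rides along as a row restriction.
                    .withRowRestriction("(`type` = 'story' OR `type` = 'job') AND `score` > 2"));
        p.run().waitUntilFinish();
      }
    }
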
    Sep 10, 2020 6:45:58 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 10, 2020 6:46:00 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 10, 2020 6:46:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-0WEIyarDFMQOgVVqWiYp2F7VpnG6Fxii480LAFon9mU.jar
    Sep 10, 2020 6:46:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-gfRbazVn-8pD5Npk_tABntlQaUktjQvp8WUW8_Qjjbo.jar
    Sep 10, 2020 6:46:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-fO1DONJRbRbyiwfU-ohklP63dpjdzBT3E9JKgwfUTwY.jar
    Sep 10, 2020 6:46:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/6.6.1/workerMain/gradle-worker.jar to gs://temp-storage-for-perf-tests/loadtests/staging/gradle-worker-U_aL5a9pcLxwBxCREhg6dHkIraZJd5OxEJ7qw-KLY7I.jar
    Sep 10, 2020 6:46:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-IPIaEyYFK6O7F8xDFkCbE1S25Ww2Qdok0JhAQXquA80.jar
    Sep 10, 2020 6:46:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-uN-VGOlSj4l97hwBu4uOy-MerHzdglcWK2eTk1FOhrc.jar
    Sep 10, 2020 6:46:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-pE_pWjWLYj3EAJHbiNa6kTqyoYbrfJUq06BL_02cnWk.jar
    Sep 10, 2020 6:46:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-Hat2RYrzChC-8lnD9969Q0f9QUDSkZjQzbhOKXSRWg0.jar
    Sep 10, 2020 6:46:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-xPFZ_ahYOGy-T3Go11fZpKbgX0g01DDIoCD70YghAhs.jar
    Sep 10, 2020 6:46:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-ygDQSgSv_fEXzmjtOpXEFGZh6oC0KUge-g8RT3U9cwM.jar
    Sep 10, 2020 6:46:00 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-ylu8RvH5CDVyFdfyeCOD_XeCxFA7uTuyuKu-hqv6WR8.jar
    Sep 10, 2020 6:46:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tcZLEJWJ5NiIi4Pz5H2ASgIy0iSZedhUho9VvXc5Blc.jar
    Sep 10, 2020 6:46:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-5dGQdxbOtnakQaJMEL3rGaRyi0Ntqsxu5qaT05shMsI.jar
    Sep 10, 2020 6:46:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-ylu8RvH5CDVyFdfyeCOD_XeCxFA7uTuyuKu-hqv6WR8.jar
    Sep 10, 2020 6:46:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-rPqo6N2YwR3JKE5B6TUi9waW4JA4xK6qXsSbAIDhe0o.jar
    Sep 10, 2020 6:46:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-ynPBJ0d9WgCfCO3DRzpylxXRd5Ujt2w4XE61yWv0ul8.jar
    Sep 10, 2020 6:46:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-n-0TkJ8gzoIyJ-kuj-FchiKwG66axH6rlrHFc9hPrwc.jar
    Sep 10, 2020 6:46:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-rGsXwuaE0BG1L_4H1OjUV-S8Hqf1lTjb_M_diNuiC4U.jar
    Sep 10, 2020 6:46:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-7Sru_uvgEAv2Yu8KdLVrGIrUzPk93kIhZx8po7TDQbw.jar
    Sep 10, 2020 6:46:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-SSruovCV-lMDia47TS8oZa7Qfv4b_pUHoY2AIpsnhSc.jar
    Sep 10, 2020 6:46:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-vEibZ0ZgfELaCNCvSXRfMEBEwSmbdQbVgqwoBVSxP0M.jar
    Sep 10, 2020 6:46:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-MPPpUxW8WFCdmfCcLuPvITI9frmj1yYEFLVVrhL2j1U.jar
    Sep 10, 2020 6:46:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-lmSjZ4viXpD9qCY1Fyz4Ivwgn802NllhV9CVRLN73So.jar
    Sep 10, 2020 6:46:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-AKOLNVtosPXnO0VcBFW7It83AfSQZbJ04q__KztCaSA.jar
    Sep 10, 2020 6:46:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7050410037367012818.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-SnNqWqOTf7Q4o14QYmAI_Z548PPCympo5hLrF0cT4cw.jar
    Sep 10, 2020 6:46:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-UewxIgW1qm6Tq400fgPfHX3iiRcb-dALAauphIroAJY.jar
    Sep 10, 2020 6:46:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-CO2HfwqpydQG9WVPMoz-DTZRTw0A_M8OcVeBjfFm2aQ.jar
    Sep 10, 2020 6:46:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-3EI3Sob0z1WAvwuii8FqDa804AIbLNjb2-h7Kwa1_NA.jar
    Sep 10, 2020 6:46:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-RbG2cOYr1dH30GOMesOjDuopIWvjSBWMQWpoHZCzq4Y.jar
    Sep 10, 2020 6:46:01 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-3X09vT6ccpFBGNEfZzA86ceey6AFhA0QDSQKUHBJVXo.jar
    Sep 10, 2020 6:46:01 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-tuMN9Na9L0jT0Ga4tY937Zy_HplZ99dDuLW5PRv3WFo.jar
    Sep 10, 2020 6:46:01 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-BTxzgQLf2hcY68KBzzf4MkdYpk9m5jd-kShxDoBugS4.jar
    Sep 10, 2020 6:46:01 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 179 files cached, 31 files newly uploaded in 1 seconds
    Sep 10, 2020 6:46:01 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 10, 2020 6:46:01 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 10, 2020 6:46:01 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 10, 2020 6:46:01 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 10, 2020 6:46:02 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 10, 2020 6:46:02 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92071 bytes, hash 2e5a71d66a6aa763115b7acb4e8a3c04ec196f4f124280d7b6eba07a1451ca0f> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Llpx1mpqp2MRW3rLToo8BOwZb08SQoDXtuugehRRyg8.pb
    Sep 10, 2020 6:46:02 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 10, 2020 6:46:03 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-10_11_46_02-10229227285528791390?project=apache-beam-testing
    Sep 10, 2020 6:46:03 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-10_11_46_02-10229227285528791390
    Sep 10, 2020 6:46:03 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-10_11_46_02-10229227285528791390
    Sep 10, 2020 6:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-10T18:46:02.569Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 10, 2020 6:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T18:46:10.077Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 10, 2020 6:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T18:46:10.711Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 10, 2020 6:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T18:46:10.787Z: Expanding GroupByKey operations into optimizable parts.
    Sep 10, 2020 6:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T18:46:10.832Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 10, 2020 6:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T18:46:10.972Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 10, 2020 6:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T18:46:11.008Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 10, 2020 6:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T18:46:11.035Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 10, 2020 6:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T18:46:11.058Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 10, 2020 6:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T18:46:11.433Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 10, 2020 6:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T18:46:11.509Z: Starting 5 workers in us-central1-b...
    Sep 10, 2020 6:46:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-10T18:46:32.552Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 10, 2020 6:46:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T18:46:38.893Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 10, 2020 6:47:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T18:47:01.737Z: Workers have started successfully.
    Sep 10, 2020 6:47:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T18:47:01.784Z: Workers have started successfully.
    Sep 10, 2020 6:47:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T18:47:44.302Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 10, 2020 6:47:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T18:47:44.474Z: Cleaning up.
    Sep 10, 2020 6:47:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T18:47:44.579Z: Stopping worker pool...
    Sep 10, 2020 6:48:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T18:48:28.982Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 10, 2020 6:48:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T18:48:29.051Z: Worker pool stopped.
    Sep 10, 2020 6:48:37 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-10_11_46_02-10229227285528791390 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): be1e99a3-ff11-4186-8397-1f6fa5c8a672 and timestamp: 2020-09-10T18:48:37.483000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    22.518

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 10, 2020 6:48:38 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 3 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 2 mins 50.05 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 19s
107 actionable tasks: 80 executed, 27 from cache

Publishing build scan...
https://gradle.com/s/bjeuvz7hzlypw

Stopped 2 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #978

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/978/display/redirect>

Changes:


------------------------------------------
[...truncated 294.72 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 10, 2020 12:45:21 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 10, 2020 12:45:21 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 10, 2020 12:45:21 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 10, 2020 12:45:21 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 10, 2020 12:45:21 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 10, 2020 12:45:21 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 10, 2020 12:45:21 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 10, 2020 12:45:21 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 10, 2020 12:45:21 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 10, 2020 12:45:21 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 10, 2020 12:45:21 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 10, 2020 12:45:21 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 10, 2020 12:45:21 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 10, 2020 12:45:22 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 10, 2020 12:45:22 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 10, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 10, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 10, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-bKZ--vXSc_lpMwP_g_ibuOP96hTvylQGxzw4BBNsPvw.jar
    Sep 10, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-swO3_W5btQ8suCbZYkThwgu0txNgKQM4ECIrG95LA4s.jar
    Sep 10, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-UreiqRyhKLY7vrfYm8oV0jM5QnqkcZTunFj8nBm4mAU.jar
    Sep 10, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-OR0gg3mlpnmWmRvykPVolmpE8fwCGPzuy4MRKyr3xhY.jar
    Sep 10, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-v5ii0caRXVPW7EAh0fi55E_xfJGJu-WmYM631f7_RuQ.jar
    Sep 10, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-OEVeHonuuR3_dEW6FEgpmnW8a75vF1JGa8D0FDQzQSo.jar
    Sep 10, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-GJlgYeu-qEsAHqNOwP3YCvWSLt5IVO3dbx7jeRtc8A0.jar
    Sep 10, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-bKZ--vXSc_lpMwP_g_ibuOP96hTvylQGxzw4BBNsPvw.jar
    Sep 10, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-g3ldI_8PO0lhdcrRgqY_gF2AgLcXARYUHpt8und40BI.jar
    Sep 10, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-2EobleMsF58lRm_tPeaOtdwkz-wN2J_6Xkaq26-ZYCQ.jar
    Sep 10, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5582218162116404072.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-NaUE-U4M601QqY3P-P-wZA6P2I4sS17ak9xM7O6P6J8.jar
    Sep 10, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-cvLBlzfp9E96om3fUdTp88e2Lyfc8VP9vycPs3JpBNk.jar
    Sep 10, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-AULZiL_TnR2690SPT5Br30vI59qQo6xRgw7h7dBZxYY.jar
    Sep 10, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-63fraZHs3F9MQL3Goevb90cxMTT9-XNw66w0kZYdEis.jar
    Sep 10, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-kloqnlk8DJw3AtP9UVjT9bxRazvoR6acDXUJ7gguNkQ.jar
    Sep 10, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-VBoLJtMq3RO8Q2ZjztdM4VaWv6ls7f3ozdKwCXHBR-w.jar
    Sep 10, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-4SpeeWzfZb5G66MqEQsxVa9da1wYpyc0-N2Dhz9SKkw.jar
    Sep 10, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-8T72Uemx6Cr8OrVOD_9H51CoRXxeal-oyru9wLEwmWg.jar
    Sep 10, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-Gvrgu-fsEfrMvrhn-avn7J3QvyXnEBpi9tF9TzufH8Q.jar
    Sep 10, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-l7lpoFFfLLRfZ6xzN-IS2bF7byLWCqlJ8cTLAaBKJ4Q.jar
    Sep 10, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-P9IGvbM75a99kwZ5-UCdRg93org2WxP0yT_7Fm9TntE.jar
    Sep 10, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-eUqCPlmWCTlHPBTg3JBVfqdyylgej0SCEUNZS-ObtUM.jar
    Sep 10, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-CnpGWYCWFQTwWeNq517iCyax0joMgujeqWjauxQfVtU.jar
    Sep 10, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-hHx1F9nUjrVjnN31-uaAAshcdod8KtoQTwDvVoSDZp4.jar
    Sep 10, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-y2vrRA93gwyQ-Vu_Guu5lYSdX8dQ7DB7Rv8858aBwoA.jar
    Sep 10, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-uAFqddBaBNkgmGta92crO300A6OuhWkFRhUUYOM-hKM.jar
    Sep 10, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-v03n7tTZ7kClh7RleYFIBdZZwsFIi7zD4vrCafwJsIo.jar
    Sep 10, 2020 12:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-MDfiTdtL7SjiYUHrTCVOZWgYpRBLmOr2i7X2onACPJE.jar
    Sep 10, 2020 12:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-J7Ij4QfdtZHV45DkV-4MMwEIz4qztF3BTgKeNuxxVvc.jar
    Sep 10, 2020 12:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-d8_AcpCc2kCFIdyRVTBpvbf0J_D8Zp9jjqvbzevZVME.jar
    Sep 10, 2020 12:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-SyzhaIQjE_j0dsJBOD_DLfZDLu9n2dnBFvrYusY6mhI.jar
    Sep 10, 2020 12:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 1 seconds
    Sep 10, 2020 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 10, 2020 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 10, 2020 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 10, 2020 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 10, 2020 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 10, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92060 bytes, hash 40f7295ce05905faa5eef0d54dd460e1c87154aa1f6259eba9dd3fa23c157fc1> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-QPcpXOBZBfql7vDVTdRg4chxVKofYlnrqd0_ojwVf8E.pb
    Sep 10, 2020 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 10, 2020 12:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-10_05_45_26-12099994824242108149?project=apache-beam-testing
    Sep 10, 2020 12:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-10_05_45_26-12099994824242108149
    Sep 10, 2020 12:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-10_05_45_26-12099994824242108149
    Sep 10, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-10T12:45:26.409Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 10, 2020 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T12:45:40.695Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 10, 2020 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T12:45:41.493Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 10, 2020 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T12:45:41.525Z: Expanding GroupByKey operations into optimizable parts.
    Sep 10, 2020 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T12:45:41.560Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 10, 2020 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T12:45:41.637Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 10, 2020 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T12:45:41.672Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 10, 2020 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T12:45:41.703Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 10, 2020 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T12:45:41.736Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 10, 2020 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T12:45:42.070Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 10, 2020 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T12:45:42.142Z: Starting 5 workers in us-central1-b...
    Sep 10, 2020 12:45:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-10T12:45:58.756Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 10, 2020 12:46:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T12:46:13.615Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 10, 2020 12:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T12:46:30.505Z: Workers have started successfully.
    Sep 10, 2020 12:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T12:46:30.537Z: Workers have started successfully.
    Sep 10, 2020 12:47:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T12:47:01.845Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 10, 2020 12:47:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T12:47:01.993Z: Cleaning up.
    Sep 10, 2020 12:47:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T12:47:02.078Z: Stopping worker pool...
    Sep 10, 2020 12:47:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T12:47:42.965Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 10, 2020 12:47:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T12:47:43.015Z: Worker pool stopped.
    Sep 10, 2020 12:47:51 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-10_05_45_26-12099994824242108149 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 0bbd0d36-716e-444d-b08f-7fd73f7923b1 and timestamp: 2020-09-10T12:47:51.630000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    12.822

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 10, 2020 12:47:52 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 2 mins 38.068 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 31s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/ev62opagvwbfk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #977

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/977/display/redirect>

Changes:


------------------------------------------
[...truncated 293.94 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 10, 2020 6:45:22 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 10, 2020 6:45:22 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 10, 2020 6:45:22 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 10, 2020 6:45:22 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 10, 2020 6:45:22 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 10, 2020 6:45:22 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 10, 2020 6:45:22 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 10, 2020 6:45:22 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 10, 2020 6:45:22 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 10, 2020 6:45:22 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 10, 2020 6:45:22 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 10, 2020 6:45:22 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 10, 2020 6:45:22 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 10, 2020 6:45:23 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 10, 2020 6:45:23 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 10, 2020 6:45:23 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 10, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 10, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-VUwTsVJXAT4v2dq-0zOGrWuNKE5mnFF2FSS7G3Ymx4c.jar
    Sep 10, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-g9tQO40nexa6SGT-EN2q87vMgKPDjjK5xw7Yfa1qd-w.jar
    Sep 10, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-otUD7SDNHrlrsu84tduhAo6PDhYIQFlFCyUSjedzKSQ.jar
    Sep 10, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-TQ_1B9nUnmepfkYMjVNpb7k_8AMKgWiRl-6CONdYfvQ.jar
    Sep 10, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-MvskQvZlm2NL0cNiy16NyJvnvptW1r-WA3LDD36MYw4.jar
    Sep 10, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-h7gsBqHgFnnbP6mBl1L_3PovR5Yap6H0Ix39QN17xUs.jar
    Sep 10, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-VUwTsVJXAT4v2dq-0zOGrWuNKE5mnFF2FSS7G3Ymx4c.jar
    Sep 10, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-TKt8XCp1ZTRsaAUxF7b_5g-rzUaMJsxYxAa6BsXxkbY.jar
    Sep 10, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-j_8ZBV8bKS3CEfff6Ox10UsJ12FERQYQlUfWHrG0ayc.jar
    Sep 10, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-FIdkaePSg-roiTaXbsRH-MpKNoOB0_8nHENQUip1Clg.jar
    Sep 10, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-bxlnRT6BDrgd-xNcg1_a4NOZqpWoR_iE65j23teR1tI.jar
    Sep 10, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-LzfOsIPNbiDEYPh_r2cZXIeFpccsoL9M_Ux_Kp3xqwk.jar
    Sep 10, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-1DHU3p5MkVmBT3SYoLishgfhQ6OE2DfCJsuduWQ1wno.jar
    Sep 10, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-C5I0o9POTH0t04T8gxtB_y5TvW5GctbhUSmKc9pxNFY.jar
    Sep 10, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-SMu3P_It-g_M8f3ryIcpp_aACRFmbFChrsTxHeHGulI.jar
    Sep 10, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-LvaGEtv1IEKQZ2IRX8Rb3bTTIYgR9Jcq-V_qrRnF_SI.jar
    Sep 10, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-E1u7AJUQ_vagLBWNBR_bHUksG9y0LVS_RDXghCKEpOQ.jar
    Sep 10, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3742486861977036169.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-OyhEayZlIwxObvSn9-UJQuvk6b8m3sh9v8cFa0t9604.jar
    Sep 10, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-pG3NxuhV4gJmXPcVCuz2MI_vmTPhpJK1pL9Vcg4AECs.jar
    Sep 10, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-tED6mS84D7PDSN3SoyjnCQ_0wHyAcbOFGN7uAIiwYzM.jar
    Sep 10, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-iE8fXSAGu8wnGp_X6lhLh_ElAjDcw7SvogT5Z11XkFI.jar
    Sep 10, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-0VDBB7Bo4z_iS-RHiGkA3Y9h3oyvhH6V9VDX34M0aDg.jar
    Sep 10, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-6wNlYc-PmzpPkVmTDb5Q6QyNRFmnz1MS7PJ64icTah0.jar
    Sep 10, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-zhyAYLZlprGeoXmLlbBLJzvFDv9CI66qUTRKiDJu2o8.jar
    Sep 10, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-5VVS2pl-8INmkFYLOwb7qmkMKcS_qq_myvqD0Uxt1ng.jar
    Sep 10, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-w8s5Yl0eAXowbxmLWOMWrPRFvTZ4pcMoFWeizEZ8dNE.jar
    Sep 10, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-xriEgQ3AkVAGcZpNQ4tZ62Eze6u0541pew2flprBHNY.jar
    Sep 10, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-YlCjCKmIYmxM0-PnPN-rjddcxHp78DlAynfHhgXs2IE.jar
    Sep 10, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-Tr0sMbKnk4DRhv35IgzCauLv3vCcEZWyZa9QW9HFj8E.jar
    Sep 10, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-eh4C51dSP31OOzxn8iE__H5GpL51JLTq3v4cc4ls89Q.jar
    Sep 10, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-iWCq_URT5Wh-2ziLlJ21iI6Mtn9nmOBY3gEc66O5mRI.jar
    Sep 10, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 1 seconds
    Sep 10, 2020 6:45:26 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 10, 2020 6:45:27 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 10, 2020 6:45:27 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 10, 2020 6:45:27 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 10, 2020 6:45:27 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 10, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92055 bytes, hash ec6f43335c1f63ca7ec3760aa25c943fdf35ad62dfb807f5f677807d475f3f4e> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-7G9DM1wfY8p-w3YKolyUP981rWLfuAf19neAfUdfP04.pb
    Sep 10, 2020 6:45:27 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 10, 2020 6:45:34 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-09_23_45_27-932784473032290779?project=apache-beam-testing
    Sep 10, 2020 6:45:34 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-09_23_45_27-932784473032290779
    Sep 10, 2020 6:45:34 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-09_23_45_27-932784473032290779
    Sep 10, 2020 6:45:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-10T06:45:27.397Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 10, 2020 6:45:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T06:45:40.271Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 10, 2020 6:45:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T06:45:40.729Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 10, 2020 6:45:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T06:45:40.873Z: Expanding GroupByKey operations into optimizable parts.
    Sep 10, 2020 6:45:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T06:45:40.901Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 10, 2020 6:45:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T06:45:40.963Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 10, 2020 6:45:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T06:45:40.991Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 10, 2020 6:45:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T06:45:41.016Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 10, 2020 6:45:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T06:45:41.039Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 10, 2020 6:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T06:45:41.545Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 10, 2020 6:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T06:45:41.701Z: Starting 5 workers in us-central1-b...
    Sep 10, 2020 6:46:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-10T06:46:07.480Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 10, 2020 6:46:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T06:46:12.416Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 10, 2020 6:46:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T06:46:30.008Z: Workers have started successfully.
    Sep 10, 2020 6:46:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T06:46:30.045Z: Workers have started successfully.
    Sep 10, 2020 6:47:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T06:46:59.901Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 10, 2020 6:47:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T06:47:00.048Z: Cleaning up.
    Sep 10, 2020 6:47:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T06:47:00.125Z: Stopping worker pool...
    Sep 10, 2020 6:47:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T06:47:58.037Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 10, 2020 6:47:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T06:47:58.075Z: Worker pool stopped.
    Sep 10, 2020 6:48:06 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-09_23_45_27-932784473032290779 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 05114464-da91-4851-8612-09bd061edf28 and timestamp: 2020-09-10T06:48:06.535000000Z:
                     Metric:                    Value:
                   read_time                    14.395
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 10, 2020 6:48:06 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.018 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker,5,main]) completed. Took 2 mins 51.714 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 52s
106 actionable tasks: 71 executed, 35 from cache

Publishing build scan...
https://gradle.com/s/kru3chqxxijj4

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #976

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/976/display/redirect?page=changes>

Changes:

[jiadaixia] [BEAM-7925]add projection

[jiadaixia] [BEAM-7925]spotless

[jiadaixia] [BEAM-7925]add schema encoder

[jiadaixia] rename and remove duplicates

[jiadaixia] Modify description

[jiadaixia] Add description


------------------------------------------
[...truncated 295.56 KB...]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 10, 2020 12:45:29 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 10, 2020 12:45:29 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 10, 2020 12:45:29 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 10, 2020 12:45:29 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 10, 2020 12:45:29 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 10, 2020 12:45:29 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 10, 2020 12:45:29 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
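
The IllegalStateException above is the standard missing-schema failure for a PCollection<Row>: the message itself points at PCollection.setRowSchema. Below is a minimal Java sketch of that call, assuming `rows` is the PCollection<Row> returned by BeamSqlRelUtils.toPCollection and using the four projected columns from the query; the field names, nullability, and wrapper class are illustrative, not taken from the IT source:

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    class RowSchemaFixSketch {
      // Attach an explicit row schema so the SDK can derive a coder for the
      // PCollection<Row> instead of failing in PCollection.getCoder().
      static PCollection<Row> withExplicitSchema(PCollection<Row> rows) {
        Schema schema =
            Schema.builder()
                .addNullableField("author", Schema.FieldType.STRING)
                .addNullableField("type", Schema.FieldType.STRING)
                .addNullableField("title", Schema.FieldType.STRING)
                .addNullableField("score", Schema.FieldType.INT64)
                .build();
        return rows.setRowSchema(schema);
      }
    }

In the IT the schema would normally come from the rel node's output row type, so this only illustrates the API call the error message recommends.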

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 10, 2020 12:45:30 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 10, 2020 12:45:30 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 10, 2020 12:45:30 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 10, 2020 12:45:30 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 10, 2020 12:45:30 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 10, 2020 12:45:30 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 10, 2020 12:45:30 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 10, 2020 12:45:30 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 10, 2020 12:45:31 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 10, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 10, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-R0-H23NttD1lPAAIRpqIiJpL40U6GGj0lSfajur1jaQ.jar
    Sep 10, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-T75vA3fLMFp7G0jIrNfQqhUQvDOfTvFv0BDTDAG17C4.jar
    Sep 10, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-R0-H23NttD1lPAAIRpqIiJpL40U6GGj0lSfajur1jaQ.jar
    Sep 10, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-P4LNbvUHXvH_iZU0pplMjxWHijL7okgvtKnowshP0WA.jar
    Sep 10, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-RHvmE6xUnaovnW1T42KgnPS0Y8tiWYyisI5HjFMU8XA.jar
    Sep 10, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-iJbeuw0UXNG9gC10M9AdyU1ArtSeh-F5TkKwEL7vtcU.jar
    Sep 10, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-MyAirtn59KFSEpav4dCT9sXRMw1BcUOBastDQ6wUH2w.jar
    Sep 10, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-Flmmcgx3NvWjHo1ncFCuua1Qsleh9RD1njS1xOrpL9k.jar
    Sep 10, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-T6Wxxk7EKhpJDYyh1aVHC5eO6_RRXVJnIgnVvOIEd8M.jar
    Sep 10, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-z0IRkf1abjjrMmlDuUFRw63fPwmzH5w9FkVKlx6SR1w.jar
    Sep 10, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-y6ldo4YBf6iqXGFpSYKsb7vaVWqNOpEtzP8rVLMBa68.jar
    Sep 10, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-jQv5gJYa36AzKM-p4slrvqJLLeIs4wf-MtX4LY4uBaY.jar
    Sep 10, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-e1MWu8BzLFZDkNKpS1uihYKPLsWOHydyvevez6rqC4U.jar
    Sep 10, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-cmJhTjHd2D6dz4jS8eTTv-jEdI_7-rAhbqPMHwBCrfQ.jar
    Sep 10, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-8Bu_ZBUw4ulNy4WI9pMjGI-RyQqCmnSjHQvr4fBnC-A.jar
    Sep 10, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-9c-oi-zBGRPJS-33q6oy_GYWKkQcnydCt7lCd7evIK0.jar
    Sep 10, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-6mzkacQ_Ca8Sx12C76OERKZYZHqwI991viKoA3V5LF0.jar
    Sep 10, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-FOcdLcq8PyDe7BBRXPS5Xm2crKDXbkObpyz5YW3emeM.jar
    Sep 10, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-VDx_88VswgjBUKMdx9yR8BSzpw49feKfD1htPefzcvM.jar
    Sep 10, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-bTfU-UL4HtQQ-74FnClCF5kkIwGbNxQ5FrKWc60_L90.jar
    Sep 10, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-LPFijZvDxlDGYeUORlt-pPChE-Or2lPLaqK_BEdt9ew.jar
    Sep 10, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5249225679430433749.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-ArN4zFyQrRNnsnJQ60rFONJGV0EdMkyCzMqpGC__Gac.jar
    Sep 10, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-xdLQi32m6BlwApXBwrVv4KxliUljLTA2-UNtxsZRiCU.jar
    Sep 10, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-aridxMTwxSq5czx9PegWwRrFVE8BMFf-8YFG09BnZ1Q.jar
    Sep 10, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-bKKmgxvRTMtobiazKScpMLiB3yBjdddproMLP69j580.jar
    Sep 10, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-lWDPoE3HUZPKLoCzaR-EFLByT3xk_BTzTUEELOaxuc0.jar
    Sep 10, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT--ArfOt18QtvBbndx5xZVcH4PHOuTxWFyzSFqbWGTipE.jar
    Sep 10, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-ZoX8Da7UxgN1uKBW102Q_Xt0rNj0P-m2cyJYcXNMaJw.jar
    Sep 10, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-VD5Lg8Gu61rapePu8GQfjnkGhv3JeF9UfZMgenzakiw.jar
    Sep 10, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-y1LueXSHZr90YThR2dsnzpor7a6vFG6Fm94qPxtVGy0.jar
    Sep 10, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-vzvIKcmEC-qrxEuCmTgeZ1slP7VaxqDGR2N6YXZFWeY.jar
    Sep 10, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 1 seconds
    Sep 10, 2020 12:45:34 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 10, 2020 12:45:34 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 10, 2020 12:45:34 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 10, 2020 12:45:34 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 10, 2020 12:45:34 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 10, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92060 bytes, hash 794422c3b772972ff6966fdd5c33ce33096730f8f1b1a374cecd5822dc3176a8> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-eUQiw7dyly_2lm_dXDPOMwlnMPjxsaN0zs1YItwxdqg.pb
    Sep 10, 2020 12:45:35 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 10, 2020 12:45:36 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-09_17_45_35-14607885282040864557?project=apache-beam-testing
    Sep 10, 2020 12:45:36 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-09_17_45_35-14607885282040864557
    Sep 10, 2020 12:45:36 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-09_17_45_35-14607885282040864557
    Sep 10, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-10T00:45:35.109Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 10, 2020 12:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T00:45:42.851Z: Worker configuration: n1-standard-1 in us-central1-a.
    Sep 10, 2020 12:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T00:45:43.833Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 10, 2020 12:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T00:45:43.873Z: Expanding GroupByKey operations into optimizable parts.
    Sep 10, 2020 12:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T00:45:43.905Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 10, 2020 12:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T00:45:43.982Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 10, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T00:45:44.012Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 10, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T00:45:44.165Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 10, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T00:45:44.204Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 10, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T00:45:44.583Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 10, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T00:45:44.658Z: Starting 5 workers in us-central1-a...
    Sep 10, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-10T00:45:52.259Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 10, 2020 12:46:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T00:46:11.963Z: Autoscaling: Raised the number of workers to 3 based on the rate of progress in the currently running stage(s).
    Sep 10, 2020 12:46:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T00:46:11.993Z: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
    Sep 10, 2020 12:46:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T00:46:17.378Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 10, 2020 12:46:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T00:46:35.306Z: Workers have started successfully.
    Sep 10, 2020 12:46:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T00:46:35.348Z: Workers have started successfully.
    Sep 10, 2020 12:47:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T00:47:10.396Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 10, 2020 12:47:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T00:47:10.559Z: Cleaning up.
    Sep 10, 2020 12:47:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T00:47:10.646Z: Stopping worker pool...
    Sep 10, 2020 12:48:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T00:48:03.739Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 10, 2020 12:48:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T00:48:03.779Z: Worker pool stopped.
    Sep 10, 2020 12:48:11 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-09_17_45_35-14607885282040864557 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 74765804-825b-455d-99f8-4ab6324d6e5d and timestamp: 2020-09-10T00:48:12.038000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    14.627
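
For the DIRECT_READ run above, the BEAMPlan shows both the projection (usedFields=[by, type, title, score]) and the filter being pushed into the source, and BigQueryTable.buildIOReader logs the pushed-down predicate. At the BigQueryIO level that push-down corresponds roughly to the sketch below; the table reference is a placeholder and this is only an illustration of the Storage Read API options, not the code the SQL extension actually generates:

    import java.util.Arrays;
    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;

    class DirectReadPushDownSketch {
      // Storage Read API (DIRECT_READ) read with the projected fields and the
      // row restriction that the log above reports as pushed down.
      static TypedRead<TableRow> read() {
        return BigQueryIO.readTableRows()
            .from("apache-beam-testing:beam_performance.hacker_news")  // placeholder table
            .withMethod(TypedRead.Method.DIRECT_READ)
            .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
            .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2");
      }
    }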

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 10, 2020 12:48:12 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 2 mins 51.836 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 56s
106 actionable tasks: 71 executed, 35 from cache

Publishing build scan...
https://gradle.com/s/jaa7vtf6n3hpk

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #975

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/975/display/redirect?page=changes>

Changes:

[yoshiki.obata] [BEAM-10860] avoid dictionary size change when shutting down

[Boyuan Zhang] [BEAM-10863] Change encoding of Pubsub sink to global window.

[noreply] [BEAM-10864] Update Snowflake JDBC dependency (#12793)

[noreply] Merge pull request #12709 from [BEAM-8258] add more options and

[Kamil Wasilewski] Fix Python formatting on master branch


------------------------------------------
[...truncated 296.60 KB...]
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 09, 2020 6:47:03 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 09, 2020 6:47:03 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 09, 2020 6:47:03 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 09, 2020 6:47:03 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 09, 2020 6:47:03 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 09, 2020 6:47:03 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 09, 2020 6:47:03 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 09, 2020 6:47:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 09, 2020 6:47:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 09, 2020 6:47:04 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 09, 2020 6:47:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 09, 2020 6:47:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 09, 2020 6:47:04 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 09, 2020 6:47:04 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 09, 2020 6:47:05 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 09, 2020 6:47:06 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 09, 2020 6:47:10 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 09, 2020 6:47:10 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-YICBu8Grzf9ZBP6j7tWsC-0GJDZ5oyUnJ3DsskfYUhw.jar
    Sep 09, 2020 6:47:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test6544634911890019447.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-b50ifWgrih565NhpjBxNl5td-JJzUb1bb79phh1UANY.jar
    Sep 09, 2020 6:47:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-j9nh2O5mQmmjbh6n0TVBS95IXAzClgxNZseIHnVIOTU.jar
    Sep 09, 2020 6:47:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-rESy8CvF81t01JAwhWWL_bN1H1KcvkVrRFPhtrxVYcA.jar
    Sep 09, 2020 6:47:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-iPCA6BHQklNZng1R5i1GsYmf0bjvZKZh3ucJUgITlwI.jar
    Sep 09, 2020 6:47:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-0awVRcQC6KvnZNP3aq8p8Y3ZpodG5QcmKtTYKno-SdI.jar
    Sep 09, 2020 6:47:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-g-y4-RILke7zrCB_82SPGyl0V2VlYBdoyF10LHv3BXA.jar
    Sep 09, 2020 6:47:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-vszIle_c2zNkfom5DG7NXZtII_6DX1mHPDsmoP7eRY4.jar
    Sep 09, 2020 6:47:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-JmL4ryGFQyhkByNxPoObOifl5B_tQcQako9ODbUrJQE.jar
    Sep 09, 2020 6:47:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-QQg-6FuOsAvN7p3XxLnJD5I2DHOKKfgOg0ChvJdqQ0Q.jar
    Sep 09, 2020 6:47:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-MYI9N8ufg3GBrdIdGqPyOdN29L7gWxm99zc6753ywDc.jar
    Sep 09, 2020 6:47:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-R1RjFPIUrc6oKuFCPlpUA3QxUzMdQR_EN90Zshx8S6I.jar
    Sep 09, 2020 6:47:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-pe75t5rTVRfvkx0nORIuUEiR_JlQVm9NCLskl0hAv64.jar
    Sep 09, 2020 6:47:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-g4aj9lucu7gA5Ey2gN59GyZ0cn9RdEwmSdB3Zu0Q49w.jar
    Sep 09, 2020 6:47:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-3Bf_QdwsFOud7itIr0rITzUI1Y_lis9vif1xxyr6Xwo.jar
    Sep 09, 2020 6:47:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-KXFOJaTfMtnVyKzk2WP8fHKbAtQ1DBmBZHJQbitS2R0.jar
    Sep 09, 2020 6:47:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-HGnSys2D60zl0XFdKYPJNxK2h5RNeeNKpveescVsjZk.jar
    Sep 09, 2020 6:47:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-WEmDh1ECPtaZEdw7vWnmjaCbPe_Nplz9U8Lo3j1T6l8.jar
    Sep 09, 2020 6:47:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-YICBu8Grzf9ZBP6j7tWsC-0GJDZ5oyUnJ3DsskfYUhw.jar
    Sep 09, 2020 6:47:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-1wSkz1ZJT3f0xRK1iuS2s49PglCU2Z3NHSg36OAq2Lw.jar
    Sep 09, 2020 6:47:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-KQKFAm-Y386XDpeXwK0n-TMGV8zVKbFHxjAlfdkV9jA.jar
    Sep 09, 2020 6:47:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-odbrvcKNrIIb8i0N8O94xnzQ7q8AuJUZyNpS6Dehx-8.jar
    Sep 09, 2020 6:47:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-z5dfzKjZb7zJyXmg0HSYKzJAQT8eAdS04mq4qG5KCTw.jar
    Sep 09, 2020 6:47:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-Ai-D8pESDu9NxiioQiMX9mTl75szBbPnvf6Wm-XhF5Y.jar
    Sep 09, 2020 6:47:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-hwbULFSswaQo8lWrBQ3NrNHLnYqvvgeV8RkBLcZsrvs.jar
    Sep 09, 2020 6:47:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-opj6yJGAHV98_jEDmVRNXPuJgnkFr1K0ne4WfyHv-44.jar
    Sep 09, 2020 6:47:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-RlUuyS_WAnHngyBL0TDvvq3I1UrY19qR_sqeEYTix9g.jar
    Sep 09, 2020 6:47:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-eNSjz8AimCTDpK-kYFSucg_k2rVVI5NAliMH-KCNdOI.jar
    Sep 09, 2020 6:47:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-QEixvbmWsXPlxde-LTNtycgMD2TqMH6leH14kJQuunY.jar
    Sep 09, 2020 6:47:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-A2WYqZ6SSG-t3bMXTZyELU-moCz3GxDi7uWpWga0XHg.jar
    Sep 09, 2020 6:47:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-l7gSN4eMtymTqqWU857ISG6q0IF5kI3GkS95_tnHXX4.jar
    Sep 09, 2020 6:47:15 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 5 seconds
    Sep 09, 2020 6:47:15 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 09, 2020 6:47:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 09, 2020 6:47:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 09, 2020 6:47:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 09, 2020 6:47:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 09, 2020 6:47:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92059 bytes, hash 487b9512cb92b01bf9e00dba85d509e2bf9e3ead50500aa1fe68bffb60b88465> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-SHuVEsuSsBv54A26hdUJ4r-ePq1QUAqh_mi_-2C4hGU.pb
    Sep 09, 2020 6:47:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 09, 2020 6:47:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-09_11_47_16-1923284464522544724?project=apache-beam-testing
    Sep 09, 2020 6:47:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-09_11_47_16-1923284464522544724
    Sep 09, 2020 6:47:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-09_11_47_16-1923284464522544724
    Sep 09, 2020 6:47:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-09T18:47:16.726Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 09, 2020 6:47:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T18:47:27.421Z: Worker configuration: n1-standard-1 in us-central1-a.
    Sep 09, 2020 6:47:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T18:47:27.969Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 09, 2020 6:47:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T18:47:28.008Z: Expanding GroupByKey operations into optimizable parts.
    Sep 09, 2020 6:47:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T18:47:28.032Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 09, 2020 6:47:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T18:47:28.097Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 09, 2020 6:47:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T18:47:28.124Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 09, 2020 6:47:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T18:47:28.160Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 09, 2020 6:47:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T18:47:28.197Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 09, 2020 6:47:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T18:47:28.540Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 09, 2020 6:47:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T18:47:28.618Z: Starting 5 workers in us-central1-a...
    Sep 09, 2020 6:47:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-09T18:47:46.546Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 09, 2020 6:47:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T18:47:56.929Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Sep 09, 2020 6:47:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T18:47:56.963Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Sep 09, 2020 6:48:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T18:48:02.410Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 09, 2020 6:48:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T18:48:20.972Z: Workers have started successfully.
    Sep 09, 2020 6:48:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T18:48:21.009Z: Workers have started successfully.
    Sep 09, 2020 6:48:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T18:48:57.555Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 09, 2020 6:48:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T18:48:57.723Z: Cleaning up.
    Sep 09, 2020 6:48:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T18:48:57.809Z: Stopping worker pool...
    Sep 09, 2020 6:49:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T18:49:46.840Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 09, 2020 6:49:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T18:49:46.871Z: Worker pool stopped.
    Sep 09, 2020 6:49:56 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-09_11_47_16-1923284464522544724 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): a21a6231-8f8f-4846-b567-7f878b281bb0 and timestamp: 2020-09-09T18:49:56.313000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     17.95

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 09, 2020 6:49:56 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.022 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.031 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 3 mins 9.942 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 12s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/mzwwku6bafhte

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #974

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/974/display/redirect?page=changes>

Changes:

[heejong] [BEAM-10791] Identify and log additional information needed to debug

[heejong] share histogram in a single process

[heejong] add tests

[heejong] add comments

[heejong] safer locking

[heejong] addressing comments

[heejong] fix tests

[heejong] get atomic percentile loggings

[heejong] add tests


------------------------------------------
[...truncated 295.04 KB...]
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 09, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 09, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 09, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 09, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 09, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 09, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 09, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
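
The IllegalStateException above is the SDK's standard failure when a PCollection of Beam Rows reaches pipeline construction without a schema attached, so no coder can be inferred. A minimal sketch of the fix the message points at, PCollection.setRowSchema, follows; the schema, field names, and mapping step are illustrative and are not taken from BigQueryIOPushDownIT:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.MapElements;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;
    import org.apache.beam.sdk.values.TypeDescriptor;

    public class RowSchemaExample {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        // Output schema of the mapping step (illustrative fields).
        Schema schema =
            Schema.builder().addStringField("author").addInt32Field("score").build();

        PCollection<Row> rows =
            p.apply(Create.of("alice:5", "bob:3"))
                .apply(
                    MapElements.into(TypeDescriptor.of(Row.class))
                        .via(
                            s -> {
                              String[] parts = s.split(":");
                              return Row.withSchema(schema)
                                  .addValues(parts[0], Integer.parseInt(parts[1]))
                                  .build();
                            }));

        // Rows carry no coder of their own; attaching the schema here is what
        // prevents the "Cannot provide a coder for a Beam Row" failure.
        rows.setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }

Setting an explicit coder with setCoder() would also satisfy the check, but setRowSchema is the route the error message recommends for Row elements.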

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 09, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 09, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 09, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 09, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 09, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 09, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 09, 2020 12:45:20 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 09, 2020 12:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
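
The BEAMPlan above is the push-down variant: usedFields lists the projected columns and BigQueryFilter carries the supported predicate, so both are applied by the BigQuery Storage Read API instead of being evaluated in a BeamCalcRel over the full 14-column rows (compare the DEFAULT-method plan earlier). For reference, a rough sketch of an equivalent hand-written BigQueryIO read that applies the same projection and filter on the BigQuery side; the table reference is illustrative, not the HACKER_NEWS table the test registers:

    import java.util.Arrays;
    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
    import org.apache.beam.sdk.values.PCollection;

    public class StorageReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        // DIRECT_READ lets the Storage Read API apply the column projection
        // (selected fields) and the row restriction before rows are returned,
        // which is what the SQL planner's push-down achieves automatically.
        PCollection<TableRow> rows =
            p.apply(
                BigQueryIO.readTableRows()
                    .from("bigquery-public-data:hacker_news.full") // illustrative table
                    .withMethod(Method.DIRECT_READ)
                    .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                    .withRowRestriction(
                        "(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }
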
    Sep 09, 2020 12:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 09, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 09, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-OdKvzRPQrYh5wlQ8eDU8m1SE9V14TR88wCcFRsrLE18.jar
    Sep 09, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-AXGgRQ7p98SLFkRPBKTLAUF-GOy6v43W4f9c4kX0qqc.jar
    Sep 09, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-MGxPRMGqw-JEJP9sQE-r1UA3iffcdgV5GQLZBcWJ1MU.jar
    Sep 09, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-8ZOEWsW_AwkA91gme3yK2EEJJgRkHwDeDrYIOKtyDP8.jar
    Sep 09, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-OdKvzRPQrYh5wlQ8eDU8m1SE9V14TR88wCcFRsrLE18.jar
    Sep 09, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-2iDS7ZZ2_Vgllp-GdBlBUzcr699n5XIuMJQ9f52wiO8.jar
    Sep 09, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-oMCyP7lyxK5qlQfhVikUEdY_yrTgPGvyjSvp5JCo5dY.jar
    Sep 09, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-WGfzeIqj7JLNTo1RygYk0zBpO_wv0-Kkzoy0R0pTs78.jar
    Sep 09, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-FhBg6ADxeRn-_PYs8pD4s6GJBIIpzEYaaTelHrirTig.jar
    Sep 09, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-oK_N73F3v1YrAp_azAI6Rmoz8D5yBm8Kqy7CGxde8qE.jar
    Sep 09, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-hoWA2DKVBj_4SNlYCYcyfXkJTqP7TVbTEMpwaSaz3oA.jar
    Sep 09, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-ucTDth7iTBG36nNjqZZJu2NRFGcp2iLSUwd-n_S1qr0.jar
    Sep 09, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-IblYMgPcm-HeEQkq8KynjqMCmRdIFzo6ah8LAnsBL60.jar
    Sep 09, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-aL_y_BbnsuuF2vkC61KsoJR9AghTK5m9NAYZTX-BlI4.jar
    Sep 09, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-x2dkbt-wVHHs0PabIIfroRMiuG0dv895e3pUyq-k_Ko.jar
    Sep 09, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-vWktW_K3gG1ak4kUJYW3GocyoT03efuUpzDjomiXgBI.jar
    Sep 09, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-kXbF6BuPfxfyF_eGQoQR810RKdnq782on5lR_Jo-UQw.jar
    Sep 09, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-p3tS5p-6-X1bTOloKL8HQq3_AXukBSRuuOLipDtG_J8.jar
    Sep 09, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-6RbmSOf-xBUZw-_Ifr71iYZ6iyl9fl6Y-ehFbKQxsk4.jar
    Sep 09, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-uiyFbq4LZfKcaaLGGBiAQR_qQqWE_bkXkXHiUPhHTCI.jar
    Sep 09, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-8tC0lKyDTFEdbNRuPy3Po-2hiiRfehtkRxyCymCJMzk.jar
    Sep 09, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-bwhZv_BB7KAbGk4J_YinXwbxWHNbahtRoNsyE4FDYKY.jar
    Sep 09, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-z4m3CL4Qp9_YzUuSs4nd43Dq0Z_GYaClKXNewTNeIAs.jar
    Sep 09, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-cT0ftUQUGqv9q2QXV11X8okwmABv40epV60-1VnM1_U.jar
    Sep 09, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2531929208300393189.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-MXmx_nt6SqSCd4EP1bk8xluk-wHAQaJIzeKWmmQeSKs.jar
    Sep 09, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-tsAAV7PcNdxukf9Vu7GY38PofGBVCHeM6fOakW0q1T4.jar
    Sep 09, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-xXhvavQInHdkvFiVtfBXj_rf71CwvZlrmzb7jUzUHdc.jar
    Sep 09, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-LSU3yFrdU2-84O2tiNefDkWJvCZR0UNJzI0Y94uIpWU.jar
    Sep 09, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-RhFoNMWgsNyuV35Zd-4Zp3zPGS_jxDrjIXZrFwngKqA.jar
    Sep 09, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-r1GsKTQts3pFw1H7fHV8n0Psde8emqFT5kgNpsO4TtQ.jar
    Sep 09, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-1kjWpwNvTIokn1Ttrse8dfWqiJP5TmLx8C7k9PMTmi8.jar
    Sep 09, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 0 seconds
    Sep 09, 2020 12:45:23 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 09, 2020 12:45:23 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 09, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 09, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 09, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 09, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92060 bytes, hash 5cfaceefbec389debaccda845b8408fb6acda2b16e433000e41572c305b696e5> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-XPrO777Did66zNqEW4QI-2rNorFuQzAA5BVywwW2luU.pb
    Sep 09, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 09, 2020 12:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-09_05_45_24-7788045200651798350?project=apache-beam-testing
    Sep 09, 2020 12:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-09_05_45_24-7788045200651798350
    Sep 09, 2020 12:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-09_05_45_24-7788045200651798350
    Sep 09, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-09T12:45:24.245Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 09, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T12:45:34.400Z: Worker configuration: n1-standard-1 in us-central1-a.
    Sep 09, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T12:45:35.427Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 09, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T12:45:35.458Z: Expanding GroupByKey operations into optimizable parts.
    Sep 09, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T12:45:35.483Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 09, 2020 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T12:45:35.541Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 09, 2020 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T12:45:35.565Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 09, 2020 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T12:45:35.597Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 09, 2020 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T12:45:35.636Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 09, 2020 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T12:45:36.049Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 09, 2020 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T12:45:36.125Z: Starting 5 workers in us-central1-a...
    Sep 09, 2020 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-09T12:45:44.671Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 09, 2020 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T12:46:07.426Z: Autoscaling: Raised the number of workers to 2 based on the rate of progress in the currently running stage(s).
    Sep 09, 2020 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T12:46:07.458Z: Resized worker pool to 2, though goal was 5.  This could be a quota issue.
    Sep 09, 2020 12:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T12:46:12.765Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 09, 2020 12:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T12:46:29.258Z: Workers have started successfully.
    Sep 09, 2020 12:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T12:46:29.290Z: Workers have started successfully.
    Sep 09, 2020 12:47:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T12:47:02.167Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 09, 2020 12:47:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T12:47:02.350Z: Cleaning up.
    Sep 09, 2020 12:47:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T12:47:02.438Z: Stopping worker pool...
    Sep 09, 2020 12:47:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T12:47:53.240Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 09, 2020 12:47:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T12:47:53.286Z: Worker pool stopped.
    Sep 09, 2020 12:48:02 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-09_05_45_24-7788045200651798350 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): e8e6d2ec-5a40-482e-9eba-61fe1e23b724 and timestamp: 2020-09-09T12:48:02.969000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     14.45

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 09, 2020 12:48:03 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.019 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.032 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 2 mins 52.193 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 46s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/yceqaeclazxmi

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #973

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/973/display/redirect?page=changes>

Changes:

[noreply] Support updating from a job in DRAINING state


------------------------------------------
[...truncated 292.83 KB...]
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 09, 2020 6:45:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 09, 2020 6:45:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 09, 2020 6:45:35 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 09, 2020 6:45:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 09, 2020 6:45:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 09, 2020 6:45:35 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 09, 2020 6:45:35 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 09, 2020 6:45:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 09, 2020 6:45:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 09, 2020 6:45:35 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 09, 2020 6:45:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 09, 2020 6:45:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 09, 2020 6:45:35 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 09, 2020 6:45:35 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 09, 2020 6:45:36 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 09, 2020 6:45:36 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 09, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 09, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-E0DGqBv4OhSM1jtLCJDybep9X6Yp0UqerWN93wt_mbU.jar
    Sep 09, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-VEXmG8wUbmix_xCD3JoWgn0wBJRnR5s5XGNCSCbGV9k.jar
    Sep 09, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-_c_No_hLNBMauIRky4YBlly-Ino7PqFaEyfI_GfCuwg.jar
    Sep 09, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-sC3XgjQ5qHQRBx_ywfUAaSd-R3xVCj00o9XPcufhwFM.jar
    Sep 09, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-kbUG4CMLpoKnIWKKrj3z8Od6e7fjhAWty6hJqJnUZHc.jar
    Sep 09, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test271759568709461049.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-zcLc0EPn4yceAlfSZVdRMB_6UR8kscviExOj47YRT3k.jar
    Sep 09, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-gQxHnxNxs6XjsPJFwUK31ZfI42b_oMFzyKlkBduHNaI.jar
    Sep 09, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-RjbtOnbaQl9_eJTzo58Lr5FoaDiqaC4875LvqXEyQ4c.jar
    Sep 09, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-j1W8HmKBsV_0_F5qjEq4RJTE9mflh35ZDVt3fpRQxnk.jar
    Sep 09, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-QnW5JGLW7WUuFR6ppgEdcun_vr_vhrH3b4qS7Rn35_8.jar
    Sep 09, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-FIbdluBN6KoDXceE1nJXASgi9eHe6f-Hn0TkSqpumDY.jar
    Sep 09, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-4Xtl9Nqf-nO4pdkkULkd2_csZ-nd3tA_6mMtG79BXUo.jar
    Sep 09, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT--D46nHJf3zIf76m43NisOpsnaoNJVBU76UBREOH_Fdc.jar
    Sep 09, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-OnHTRHIR2O-V09zsaBs3WAqHtVaKM4XhiD8sHN7WPs0.jar
    Sep 09, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-_yavBLvKSzaXNPO55LFPjlj2ZZ9Jb7uAuNbROnEobJc.jar
    Sep 09, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-bxNkTW3ViTIr4EYAlleUPC04350FFXEeTFiuKCXAG1Y.jar
    Sep 09, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-6yYl3O46aJUBu81iQCLrXIKMhAVHhTZVeN8ArVoeFyM.jar
    Sep 09, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-TeEvkw1zj6MMAojBk66ULTEEoTpjbfiKMXmsxBayG68.jar
    Sep 09, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-E0DGqBv4OhSM1jtLCJDybep9X6Yp0UqerWN93wt_mbU.jar
    Sep 09, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-fdmEqtq_uiiAVgr35TyowOUocdNs_meb-iKJpRmkQAk.jar
    Sep 09, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-7WKHfHF8X8zKyXNAM7hwbndpaaQbFttT2DBMdxLCo9U.jar
    Sep 09, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-PK_HTHaz3nZCH6s3WuhQhl1cEd3D-2M1MzEdEaJFVxY.jar
    Sep 09, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-EJq3BEmeJEiH3dE4wkwHpeDAPU2o21u-dIvfEHhSDaI.jar
    Sep 09, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-FftkW4qA5BaVEX4FFwJuduMTfWzEMhgu4JuGt3fAb1A.jar
    Sep 09, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-k9v53bvtfb-5Cfula1buWNyOKXuUXK3_oKqt7lt1Rq0.jar
    Sep 09, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-BjU77KGNDPq0bMCzYH7zfav7hISM1KKot_jmWP0W4x0.jar
    Sep 09, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-NHKqjh9q_4SzoDuXPfQJdYSerIlyS0OdaPZyXEw2ey0.jar
    Sep 09, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-HPiUxpsN-kxJpoYSGXE_cwDUCs63SMsktEbtI_Xg1MY.jar
    Sep 09, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-O8dsfVyPSa_6kVnisNYLj35t1C_hotwmrnnRblLAUeE.jar
    Sep 09, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-h5pE_WCBJsWaJ-TfTnTwEqYxHUl2xX18xzWJ15zd60c.jar
    Sep 09, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-fL6Cf7qc0oMhi3lHubxeyPd-sCWgtgVY9z9flL7b_oE.jar
    Sep 09, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 0 seconds
    Sep 09, 2020 6:45:39 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 09, 2020 6:45:39 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 09, 2020 6:45:39 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 09, 2020 6:45:39 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 09, 2020 6:45:39 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 09, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92059 bytes, hash 67f19168180814e0ef03961b2cddfcad260455ffeab6f204e32ade05ca8aecd0> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Z_GRaBgIFODvA5YbLN38rSYEVf_qtvIE4yreBcqK7NA.pb
    Sep 09, 2020 6:45:40 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 09, 2020 6:45:41 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-08_23_45_40-11621385688081722282?project=apache-beam-testing
    Sep 09, 2020 6:45:41 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-08_23_45_40-11621385688081722282
    Sep 09, 2020 6:45:41 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-08_23_45_40-11621385688081722282
    Sep 09, 2020 6:45:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-09T06:45:40.289Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 09, 2020 6:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T06:45:49.070Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 09, 2020 6:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T06:45:49.835Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 09, 2020 6:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T06:45:49.888Z: Expanding GroupByKey operations into optimizable parts.
    Sep 09, 2020 6:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T06:45:49.925Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 09, 2020 6:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T06:45:50.007Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 09, 2020 6:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T06:45:50.044Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 09, 2020 6:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T06:45:50.077Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 09, 2020 6:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T06:45:50.115Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 09, 2020 6:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T06:45:50.527Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 09, 2020 6:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T06:45:50.599Z: Starting 5 workers in us-central1-b...
    Sep 09, 2020 6:46:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-09T06:45:58.825Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 09, 2020 6:46:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T06:46:22.269Z: Autoscaling: Raised the number of workers to 3 based on the rate of progress in the currently running stage(s).
    Sep 09, 2020 6:46:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T06:46:22.322Z: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
    Sep 09, 2020 6:46:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T06:46:43.011Z: Workers have started successfully.
    Sep 09, 2020 6:46:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T06:46:43.048Z: Workers have started successfully.
    Sep 09, 2020 6:47:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T06:47:16.037Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 09, 2020 6:47:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T06:47:16.196Z: Cleaning up.
    Sep 09, 2020 6:47:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T06:47:16.294Z: Stopping worker pool...
    Sep 09, 2020 6:51:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T06:51:08.115Z: Autoscaling: Resized worker pool from 3 to 0.
    Sep 09, 2020 6:51:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T06:51:08.158Z: Worker pool stopped.
    Sep 09, 2020 6:51:16 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-08_23_45_40-11621385688081722282 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 1cba1b82-be4b-4760-91cd-f6427a784fde and timestamp: 2020-09-09T06:51:16.431000000Z:
                     Metric:                    Value:
                   read_time                     13.74
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 09, 2020 6:51:16 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.035 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 5 mins 51.753 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 6m 58s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/fqwfyydrzt7fa

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #972

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/972/display/redirect?page=changes>

Changes:

[yifanmai] Add support for multiple inputs

[yifanmai] Add tests for multiple inputs in PTransformOverride

[yifanmai] Lint

[noreply] Merge pull request #12703 from [BEAM-10603] Add describe and cancel to


------------------------------------------
[...truncated 294.42 KB...]
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 09, 2020 12:45:31 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 09, 2020 12:45:31 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 09, 2020 12:45:32 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 09, 2020 12:45:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 09, 2020 12:45:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 09, 2020 12:45:32 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 09, 2020 12:45:32 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
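
The failure above is exactly what its message describes: a ParDo that emits Beam Rows gives the SDK nothing to infer a coder from, so the output PCollection needs either an explicit schema (setRowSchema) or an explicit RowCoder (setCoder). A minimal, self-contained sketch of that fix follows; the 4-field schema and the pass-through DoFn are illustrative stand-ins, not the RowMonitor transform used by this integration test.

    // Sketch only: shows the setRowSchema fix the error message suggests.
    // The schema and PassThroughFn below are hypothetical, not the IT's code.
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      // Illustrative schema matching the projected columns of the query above.
      static final Schema SCHEMA =
          Schema.builder()
              .addNullableField("author", Schema.FieldType.STRING)
              .addNullableField("type", Schema.FieldType.STRING)
              .addNullableField("title", Schema.FieldType.STRING)
              .addNullableField("score", Schema.FieldType.INT64)
              .build();

      // Pass-through DoFn standing in for ParDo(RowMonitor): it outputs Rows,
      // so the SDK cannot infer a coder for its output on its own.
      static class PassThroughFn extends DoFn<Row, Row> {
        @ProcessElement
        public void process(@Element Row row, OutputReceiver<Row> out) {
          out.output(row);
        }
      }

      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        Row example =
            Row.withSchema(SCHEMA).addValues("someone", "story", "A title", 42L).build();

        p.apply(Create.of(example).withCoder(RowCoder.of(SCHEMA)))
            .apply(ParDo.of(new PassThroughFn()))
            // Without one of these two calls, coder inference fails with
            // "Cannot provide a coder for a Beam Row", as in the log above.
            .setRowSchema(SCHEMA); // or: .setCoder(RowCoder.of(SCHEMA))

        p.run().waitUntilFinish();
      }
    }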

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 09, 2020 12:45:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 09, 2020 12:45:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 09, 2020 12:45:32 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 09, 2020 12:45:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 09, 2020 12:45:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 09, 2020 12:45:32 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 09, 2020 12:45:32 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 09, 2020 12:45:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
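
The two lines above are the core of the push-down path: the planner hands the used fields and the supported filter to the BigQuery table, which then reads through the Storage API so only the matching columns and rows leave BigQuery. The sketch below shows roughly the equivalent hand-written read, assuming BigQueryIO's Storage API options (withMethod, withSelectedFields, withRowRestriction); the table spec and parse function are placeholders, not the integration test's code.

    // Sketch only: a DIRECT_READ BigQueryIO read with the same projection and
    // filter that the SQL planner pushed down above. Table spec is a placeholder.
    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.coders.StringUtf8Coder;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.SchemaAndRecord;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.transforms.SerializableFunction;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Pull one column out of each record; the storage read itself only
        // returns the selected fields and the rows matching the restriction.
        SerializableFunction<SchemaAndRecord, String> parseFn =
            record -> String.valueOf(record.getRecord().get("by"));

        p.apply(
            BigQueryIO.read(parseFn)
                .from("some-project:some_dataset.hacker_news") // placeholder
                .withMethod(BigQueryIO.TypedRead.Method.DIRECT_READ)
                // Counterpart of usedFields=[by, type, title, score] in the plan:
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                // Counterpart of the filter logged just above:
                .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2")
                .withCoder(StringUtf8Coder.of()));

        p.run().waitUntilFinish();
      }
    }
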
    Sep 09, 2020 12:45:33 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 09, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 09, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-i38qfRYOLPeUtCOy0ORsJZyDxSvWDyYBLvnim54xMZQ.jar
    Sep 09, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-a5TFoHMTU7Z6PP244TyXp4RH4my-vbMmN5bbwfXKm7Y.jar
    Sep 09, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-_9DUzQM6nf2gy-BnorcOqEHHOsq9z5EOm5NmI0WzSkE.jar
    Sep 09, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-KTojdAYfFOSCZcl1o3FwGxu-Jpbyh9VYa5hR4TRJMK8.jar
    Sep 09, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests--6aPEBBmMoaXj8jFqUnXFsXHCGHKLaHRgoJfZTMJF_A.jar
    Sep 09, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-SsehBZhG81g6wDffrNth1SgzHtApPIVQj0_tkX37DQQ.jar
    Sep 09, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8688062741569456157.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-2EhycPecsmjC-Ej-uFqaTETV1UBPRAKnfxGsjxqyM4I.jar
    Sep 09, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-zpuru1y-wcY1g4vrDPB8rx5k21X7foGhzvi_VtpiA20.jar
    Sep 09, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-OuP_feNu43EpN4bpSusFBM1VeQpOvm_x_7zTdxw4JY0.jar
    Sep 09, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-KU1MK-d4p5rSF_bRaqpsHYY07THVINpUVXym07NVUNI.jar
    Sep 09, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-zBj_QgLq6HEg3DMifZWIJTXCgkIBxi-Ttq2EbP1YVys.jar
    Sep 09, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-r1nYzYVGme9xDocFaAUWabzqHNy2r3qpqSkLF6MGs70.jar
    Sep 09, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-Cz3GZnUY-aSg9PIFHy0ba7-WdjmenKTR8H8er_4-nGA.jar
    Sep 09, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-EhuAGLRBzmf_caw1uh0ps1W_qupbHiaFpc7C_b2cEUM.jar
    Sep 09, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-i38qfRYOLPeUtCOy0ORsJZyDxSvWDyYBLvnim54xMZQ.jar
    Sep 09, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-7yBdrdl02MH8hg8JrNmT1jhB3onoWJDn7wzkSEBkHzs.jar
    Sep 09, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-uHRMmwtmOKnCQpoa-ruqTDPhArPtEM9HmsyVD9Ok5uA.jar
    Sep 09, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-UQhd_z069CEElvBrE_QbSU_ZTz_qjPtqTQDAAwCUDvM.jar
    Sep 09, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-4SYVIUTbKCMhIOjAOrgsZ6-6lwksvsXIz4Wchd0Bzl8.jar
    Sep 09, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-ygkA0nGwftuahAq_AxcLJyw3SBlYNRI5f2zOVEByjX4.jar
    Sep 09, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-j0tongq0ce2olJqsSXm2Dp8crvppbZ16eq26c0MvViU.jar
    Sep 09, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-ycBX94_AtGR8C--wCSWLgtXaXFggplTqO-XNYuBhf8M.jar
    Sep 09, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-EY0di0mg2ixgUzguBn7ZTv6au_5_oGB-5-ZG0IZlFkE.jar
    Sep 09, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-LlPQXeVYznnG65AtXv0sz_60wwWl0ZGEl4yVLreU4SQ.jar
    Sep 09, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-TgG0Ox-WxA_E-MxQJjhA6jc87BLK5e2Lrnq_GqlZtc8.jar
    Sep 09, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-x2HcHi48ciKbtujFsYHuMeZp8iMD3I61bi40tofCoIA.jar
    Sep 09, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-fDx-FsawoRy8F6eF8ItsMJZFGYL2WgQkXYfzfPaAkoU.jar
    Sep 09, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-lAwSY-GP3YY7HspN5wDmlR-Tm4opSEeXO3bsBi9jC_Q.jar
    Sep 09, 2020 12:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-pULAr_gYt6_pU3-HG-H9QV01GYekbx2tRokeu3UihTU.jar
    Sep 09, 2020 12:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-9OfWeZD6bcHXjnJM_sQWOW4dxPy7mwam-0DU1mIllck.jar
    Sep 09, 2020 12:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-8ofOmm3jXpRNjiUpP-Zf2IXprhXX6OXQol906IR44bg.jar
    Sep 09, 2020 12:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 0 seconds
    Sep 09, 2020 12:45:36 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 09, 2020 12:45:36 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 09, 2020 12:45:36 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 09, 2020 12:45:36 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 09, 2020 12:45:36 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 09, 2020 12:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92060 bytes, hash fddb4c588fd381e6feca16cf225c83dff90918d58f89c37d22756677a8a95c72> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-_dtMWI_Tgeb-yhbPIlyD3_kJGNWPicN9InVmd6ipXHI.pb
    Sep 09, 2020 12:45:37 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 09, 2020 12:45:38 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-08_17_45_37-1554375378502726365?project=apache-beam-testing
    Sep 09, 2020 12:45:38 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-08_17_45_37-1554375378502726365
    Sep 09, 2020 12:45:38 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-08_17_45_37-1554375378502726365
    Sep 09, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-09T00:45:37.216Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 09, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T00:45:47.463Z: Worker configuration: n1-standard-1 in us-central1-a.
    Sep 09, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T00:45:50.567Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 09, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T00:45:50.610Z: Expanding GroupByKey operations into optimizable parts.
    Sep 09, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T00:45:50.634Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 09, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T00:45:50.713Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 09, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T00:45:50.740Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 09, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T00:45:50.761Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 09, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T00:45:50.803Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 09, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T00:45:51.218Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 09, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T00:45:51.287Z: Starting 5 workers in us-central1-a...
    Sep 09, 2020 12:46:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-09T00:46:11.941Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 09, 2020 12:46:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T00:46:23.543Z: Autoscaling: Raised the number of workers to 2 based on the rate of progress in the currently running stage(s).
    Sep 09, 2020 12:46:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T00:46:23.576Z: Resized worker pool to 2, though goal was 5.  This could be a quota issue.
    Sep 09, 2020 12:46:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T00:46:29.197Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 09, 2020 12:46:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T00:46:47.967Z: Workers have started successfully.
    Sep 09, 2020 12:46:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T00:46:47.995Z: Workers have started successfully.
    Sep 09, 2020 12:47:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T00:47:20.458Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 09, 2020 12:47:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T00:47:20.610Z: Cleaning up.
    Sep 09, 2020 12:47:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T00:47:20.685Z: Stopping worker pool...
    Sep 09, 2020 12:48:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T00:48:24.140Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 09, 2020 12:48:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T00:48:24.187Z: Worker pool stopped.
    Sep 09, 2020 12:48:33 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-08_17_45_37-1554375378502726365 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): ad4a8d9c-cb1c-443d-ad70-67042537526f and timestamp: 2020-09-09T00:48:33.357000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    13.311

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 09, 2020 12:48:33 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.031 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 5,5,main]) completed. Took 3 mins 11.272 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 4s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/yi3otqvokszdq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #971

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/971/display/redirect?page=changes>

Changes:

[saavan] Update codecov.yml

[Maximilian Michels] [BEAM-10760] Generalize state cleanup optimization for global window

[noreply] [BEAM-3612] Augment supported container types. (#12785)


------------------------------------------
[...truncated 295.43 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 08, 2020 6:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 08, 2020 6:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 08, 2020 6:45:19 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 08, 2020 6:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 08, 2020 6:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 08, 2020 6:45:19 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 08, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 08, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 08, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 08, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 08, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 08, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 08, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 08, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 08, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 08, 2020 6:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 08, 2020 6:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 08, 2020 6:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-HCNilZKrOQDt3z2ZA4s23k_qexjd01OKbxx_Y3Px--o.jar
    Sep 08, 2020 6:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-xQBEYwjDPTJrVuPoPeWG6FdwCNxG6sSNwaCoki-9n5Q.jar
    Sep 08, 2020 6:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-2V80CCp8ahnbjccpfDAbczhn1XBryW9o9lSuWcoufFQ.jar
    Sep 08, 2020 6:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-sshvGo_Z6tDiLFfFqffrL1oTbV_CG2Mx-EWhj-gSMpc.jar
    Sep 08, 2020 6:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-wXaqnZH4rsguAnlQMGFmAisUy2hp97hXHPmcCARMikg.jar
    Sep 08, 2020 6:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-GGHjUDxlaPWhfN0bI_ns6xKQClKljlSP2x1x-6CteZM.jar
    Sep 08, 2020 6:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-jL262icgw7MupV8Gs6zGPpNSOC4i_xrbj-QfwwtK90k.jar
    Sep 08, 2020 6:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-0PjXZPtSc99SJLsV1T9ZowuBNU5oGqO4XH0dAMTFavw.jar
    Sep 08, 2020 6:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-i8CXjsMHm-KD8CRRiSRLOYbYUzsXUrbT7-bmH9Jg-9Y.jar
    Sep 08, 2020 6:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-NIuxR_kJ1CwNII_cbc9IPkYLHJyJYTKLXRXSq70r-0E.jar
    Sep 08, 2020 6:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-LITSFnJzyUOi_-fEYy1WKpmQshMeFqgOA48FxvPbVlY.jar
    Sep 08, 2020 6:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-37MEHplSzZ-ZNBGLJNKCHKJWyqhxM5Kqrpp6sIni29s.jar
    Sep 08, 2020 6:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-Nvk0AJLZQlQYnhestP2jG0nzrKHr0Dwe5pNZYx5G974.jar
    Sep 08, 2020 6:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-HCNilZKrOQDt3z2ZA4s23k_qexjd01OKbxx_Y3Px--o.jar
    Sep 08, 2020 6:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-a-ZIhtytmv8DHHEPaYJGUvR0brA1nktQgTb2pepXq_8.jar
    Sep 08, 2020 6:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-9NNDbGptEw8XpO9-BycAf_qthsmj6HygeqHRe09U8qY.jar
    Sep 08, 2020 6:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-UCNC6baXruj24HRJbuUZ4HObyzx7WONc9cLnmvSkvjU.jar
    Sep 08, 2020 6:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-xQBbvJNuwQ77aN7DEi4C_O5NOFg9Fdmt1HrvQVZbQN8.jar
    Sep 08, 2020 6:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-ufC-wcG7gdjY_l239FlauodkN6bPrvACPhpVD06_-l8.jar
    Sep 08, 2020 6:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-NNDey4OFaw-JYE6Snw46W8AB7dzgCMDWRXTVkDf9pzY.jar
    Sep 08, 2020 6:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-01NjS92jVJoAqi3yRqHc2e6vaHO_2iDoMOTe7haJ0i4.jar
    Sep 08, 2020 6:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7373790592523026449.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-n-exin01r3YMrimXHHE7icGHzqCTsoZagFQ_3fzocw0.jar
    Sep 08, 2020 6:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-LtGC3cd14Gx8mKBdWvynGKjw5oDapdG-UpFcF7xSoUA.jar
    Sep 08, 2020 6:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-p3lKVEE0TsyXM28k9nu26wmBJMrpTCF0AEA-g9mQdL8.jar
    Sep 08, 2020 6:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-9-A1xS2rT62lak4_mGY550rtjg35RIN2fLcCG5QC2fE.jar
    Sep 08, 2020 6:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-MpLfEc90jiwPcbMh1ulAii5QzBKcUCv-iJdEh7Z5NFw.jar
    Sep 08, 2020 6:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-Fz1oHiZVK6YyUDGn915m3EC3BfB5qSajdIj1elPuy5Y.jar
    Sep 08, 2020 6:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-FS_sGrhF0NoYseUHi9wrWIx_a2FzpcOBPdywswHKEjQ.jar
    Sep 08, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-suT82od7lYt39fwr0vCpjxJWFLVPZHPVhfM5my5Bwgk.jar
    Sep 08, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-heQZDNCzt48NTFE_9M9OqRvcUfq40G0R4_8iJj1tIyQ.jar
    Sep 08, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-3IlT1SOci4xSTMfxwTqnFxjsrGv-knWKpCnfWaMPpMw.jar
    Sep 08, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 2 seconds
    Sep 08, 2020 6:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 08, 2020 6:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 08, 2020 6:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 08, 2020 6:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 08, 2020 6:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 08, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92060 bytes, hash ee7ecb68f52198de84d869c4b788153bf0ab98b28d90b4901755fb6a3551847a> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-7n7LaPUhmN6E2GnEt4gVO_CrmLKNkLSQF1X7ajVRhHo.pb
    Sep 08, 2020 6:45:29 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 08, 2020 6:45:30 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-08_11_45_29-5650897857434488991?project=apache-beam-testing
    Sep 08, 2020 6:45:30 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-08_11_45_29-5650897857434488991
    Sep 08, 2020 6:45:30 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-08_11_45_29-5650897857434488991
    Sep 08, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-08T18:45:29.413Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
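
The warning above is about two worker-pool settings interacting: when autoscalingAlgorithm is NONE, Dataflow uses the fixed numWorkers value and ignores maxNumWorkers. A minimal sketch of how these Dataflow worker-pool options are set programmatically; the values are illustrative, not this job's actual configuration:

    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions.AutoscalingAlgorithmType;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class WorkerPoolOptionsSketch {
      public static void main(String[] args) {
        DataflowPipelineOptions options =
            PipelineOptionsFactory.fromArgs(args).as(DataflowPipelineOptions.class);
        // As in this job: a fixed pool of 5 workers, autoscaling disabled.
        options.setNumWorkers(5);
        options.setAutoscalingAlgorithm(AutoscalingAlgorithmType.NONE);
        // With autoscaling enabled, maxNumWorkers would be honored instead:
        // options.setAutoscalingAlgorithm(AutoscalingAlgorithmType.THROUGHPUT_BASED);
        // options.setMaxNumWorkers(5);
      }
    }
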
    Sep 08, 2020 6:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-08T18:45:36.673Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 08, 2020 6:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-08T18:45:37.344Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 08, 2020 6:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-08T18:45:37.384Z: Expanding GroupByKey operations into optimizable parts.
    Sep 08, 2020 6:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-08T18:45:37.421Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 08, 2020 6:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-08T18:45:37.490Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 08, 2020 6:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-08T18:45:37.529Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 08, 2020 6:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-08T18:45:37.557Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 08, 2020 6:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-08T18:45:37.595Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 08, 2020 6:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-08T18:45:38.039Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 08, 2020 6:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-08T18:45:38.115Z: Starting 5 workers in us-central1-b...
    Sep 08, 2020 6:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-08T18:45:56.951Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 08, 2020 6:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-08T18:46:09.554Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 08, 2020 6:46:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-08T18:46:27.519Z: Workers have started successfully.
    Sep 08, 2020 6:46:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-08T18:46:27.569Z: Workers have started successfully.
    Sep 08, 2020 6:47:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-08T18:46:59.633Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 08, 2020 6:47:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-08T18:46:59.789Z: Cleaning up.
    Sep 08, 2020 6:47:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-08T18:46:59.882Z: Stopping worker pool...
    Sep 08, 2020 6:47:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-08T18:47:56.985Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 08, 2020 6:47:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-08T18:47:57.028Z: Worker pool stopped.
    Sep 08, 2020 6:48:05 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-08_11_45_29-5650897857434488991 finished with status DONE.
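
The logTerminalState message above is emitted once the runner reports a terminal job state. In user code this corresponds to blocking on the pipeline result; a minimal sketch of that pattern (not the integration test's actual code):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.PipelineResult;

    final class TerminalStateSketch {
      // Runs the pipeline and blocks until the runner reaches a terminal
      // state (DONE, FAILED, CANCELLED, ...), which is what gets logged above.
      static PipelineResult.State runAndWait(Pipeline pipeline) {
        PipelineResult result = pipeline.run();
        return result.waitUntilFinish();
      }
    }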

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 7fdaa4df-aeed-4289-bc0a-20418fe0c6bd and timestamp: 2020-09-08T18:48:05.090000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    14.482

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 08, 2020 6:48:05 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
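
The publisher skips exporting the fields_read/read_time metrics above because no InfluxDB measurement or database was configured for this run. A sketch of how such settings are typically supplied to the test-utils publisher; the builder method names and all values here are assumptions/placeholders, so check InfluxDBSettings in sdks/java/testing/test-utils for the exact API:

    import org.apache.beam.sdk.testutils.publishing.InfluxDBSettings;

    final class InfluxSettingsSketch {
      // Assumed builder-style configuration; host, database and measurement
      // are placeholders, not this job's configuration.
      static InfluxDBSettings settings() {
        return InfluxDBSettings.builder()
            .withHost("http://localhost:8086")
            .withDatabase("beam_test_metrics")
            .withMeasurement("sql_bqio_read_java_batch")
            .get();
      }
    }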

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.036 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 2 mins 54.345 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
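
For this particular failure, Gradle's suggestion amounts to re-running the failing task directly with one of the listed flags, for example:

    ./gradlew :sdks:java:extensions:sql:perf-tests:integrationTest --stacktrace

with --debug or --scan substituted as described in the message above when more output is needed.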

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 47s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/2r3zbih6amhai

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #970

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/970/display/redirect?page=changes>

Changes:

[Kamil Wasilewski] [BEAM-10672] Move StatefulLoadGenerator from pardo_test.py

[Kamil Wasilewski] [BEAM-10672] Several fixes to init actions for Dataproc cluster

[Kamil Wasilewski] [BEAM-10672] Add Python Combine Load Tests for streaming on Flink

[Kamil Wasilewski] [BEAM-10672] Use customized image tags

[Kamil Wasilewski] [BEAM-10672] Update Load tests jobs README

[Kamil Wasilewski] fix: separate job for building Docker images


------------------------------------------
[...truncated 292.98 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 08, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 08, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 08, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 08, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 08, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 08, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 08, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
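
The exception text above already names the two possible fixes: set a coder explicitly, or attach a row schema to the ParDo output. A minimal sketch of the setRowSchema variant, using an illustrative two-field schema rather than the HACKER_NEWS schema the test actually reads:

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    final class RowSchemaSketch {
      // Illustrative schema; the real table has more fields.
      static final Schema SCHEMA =
          Schema.builder().addStringField("author").addStringField("type").build();

      // Attaching the schema lets the SDK build a coder for the Row output,
      // which is what the IllegalStateException above asks for.
      // (Equivalent alternative: rowOutput.setCoder(RowCoder.of(SCHEMA)).)
      static PCollection<Row> attachSchema(PCollection<Row> rowOutput) {
        return rowOutput.setRowSchema(SCHEMA);
      }
    }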

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 08, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 08, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 08, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 08, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 08, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 08, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 08, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 08, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
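
The plan above pushes both the projection (usedFields) and the filter down into the BigQuery read. At the IO level this roughly corresponds to a Storage Read API (DIRECT_READ) source with selected fields and a row restriction; a hand-written sketch rather than the test's code, and the table reference is an assumption:

    import java.util.Arrays;
    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;

    final class PushDownReadSketch {
      // Field list and predicate mirror the log output; the table name is illustrative.
      static TypedRead<TableRow> read() {
        return BigQueryIO.readTableRows()
            .from("bigquery-public-data:hacker_news.full")
            .withMethod(TypedRead.Method.DIRECT_READ)
            .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
            .withRowRestriction("(`type` = 'story' OR `type` = 'job') AND `score` > 2");
      }
    }
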
    Sep 08, 2020 12:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 08, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 08, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-Z5ZxIWdJH6O3AY6UuHzNIFK3B1ELgaZN7dI4hDNh3Nc.jar
    Sep 08, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-sTwtAHuPIgPZFB3t91hWVAdmCtXWA897PwdtWwKXX-k.jar
    Sep 08, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-a4yT6m1W-8hZQ85AcqEUlbofWnkRxMULJkCkTzNdrm4.jar
    Sep 08, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-rMVT5o1W14EZiBkXWm1qT9vOEdGefdBijtbbnw4oZRc.jar
    Sep 08, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-Hq-kpZjtv4YrijHru_--58aNJ9l1qEr_y-AUiE85jpc.jar
    Sep 08, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-WMT6cfhPom3DG-FutZqwmZdKOGqh30aQdnGhQpTsOyA.jar
    Sep 08, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-bVFGwCO9BRc5KbbxY6JVF1TcYzt23UwrPF6ayv2fh4Y.jar
    Sep 08, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-THgm30OXowm4rAvrqAHRIZMdMrlAv5VJcSzboMmWqHQ.jar
    Sep 08, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-KIVc8DP6kqOLWQb5fvPp6cCnm_oeEIaYoFq3BB-8b7Y.jar
    Sep 08, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-LmIR_CkrB5ZksaT8LEapJe9qYBk0T2JFgLKRBIM7W0M.jar
    Sep 08, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-XwmrWkK6WpeuLjOdBuKXca2aU8n-elMPAUtMy7M3jO0.jar
    Sep 08, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-Jz--fTgSjYpd_Eptik-H1nMsQlNtnmAKbODIC_UXJmo.jar
    Sep 08, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-S-JI5PEgua8ZonRhXDYnqQRnlJQ97eB74tsZbrOaGi8.jar
    Sep 08, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-paa-YwWbV_X6usQVqoLD9a3PtaG_WIcTNi8VE0wKNRg.jar
    Sep 08, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-ZHat2YsefmEi6UUVC4cZ0XHdgagh6BoXKartQrexyQs.jar
    Sep 08, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-dya7X3HwaToHsQh1uj1ZBW84pXAqrafZ9E4GBpKkrGA.jar
    Sep 08, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-57_AWHt5GBiJlc6Vmtaik-Iw3AWI_pxC787BCiM9M5M.jar
    Sep 08, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-ZBe2KNUE_kZ41M_yCEXN5XLznpuV4PikyYnipgrNYqI.jar
    Sep 08, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-eXQSRUPBKFEGd4oj8-GEWGlQZu2-3ydfrGkDIzeQb3Q.jar
    Sep 08, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-906_mdNWJ4KaV0EDx4HpXQTQ0a5CuQBAqJnkLQafnsk.jar
    Sep 08, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-Z5ZxIWdJH6O3AY6UuHzNIFK3B1ELgaZN7dI4hDNh3Nc.jar
    Sep 08, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-TiBfBarHwJGcypKi8CvNKXOK7JF9cnMviI4GT4bFFiw.jar
    Sep 08, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test6296620310717304652.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-efkdDmKQUdfBWe36_8SZ9LmL0QSQnpGfT9OQOtuWGCk.jar
    Sep 08, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-Gy_Jn1H9LgJv-oFy9OaZMo1IqYm9boBM2h2arckmFPM.jar
    Sep 08, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-KCz9gW1tbvJ4lIT_laBqp00mci4nwOHXkxECsQWf8kM.jar
    Sep 08, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-N_UtfmK_8YNLaCm5XePlGpEG05Kx1q9n8gnivmWhlvM.jar
    Sep 08, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-qbYChfAd6fPOe7dlZxn8j1noMBz7B2VARV6K69xuFhE.jar
    Sep 08, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-bpJLsC18W6ZgE8nGE5grp3MHlCSqhThUBgBePAFyRjo.jar
    Sep 08, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-agTRXr84fgwVw1QW7GjvYX4vbCbflYiHYefnqxsVDVw.jar
    Sep 08, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-Jbfv1KV1x3OgOqKacDatekplYgey439fHOEmpxU94GM.jar
    Sep 08, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-rpLcWltodScQDhm4VEn_xC8QZ54e_x8xEdoLzvwha38.jar
    Sep 08, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 1 seconds
    Sep 08, 2020 12:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 08, 2020 12:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 08, 2020 12:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 08, 2020 12:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 08, 2020 12:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 08, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92061 bytes, hash 22d9be46e791497080976c1665ba24362947c1896c8842486666279e32e3dbeb> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Itm-RueRSXCAl2wWZbokNilHwYlsiEJIZmYnnjLj2-s.pb
    Sep 08, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 08, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-08_05_45_21-18105873312394791471?project=apache-beam-testing
    Sep 08, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-08_05_45_21-18105873312394791471
    Sep 08, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-08_05_45_21-18105873312394791471
    Sep 08, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-08T12:45:21.161Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 08, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-08T12:45:28.688Z: Worker configuration: n1-standard-1 in us-central1-a.
    Sep 08, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-08T12:45:29.375Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 08, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-08T12:45:29.417Z: Expanding GroupByKey operations into optimizable parts.
    Sep 08, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-08T12:45:29.447Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 08, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-08T12:45:29.516Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 08, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-08T12:45:29.547Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 08, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-08T12:45:29.581Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 08, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-08T12:45:29.613Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 08, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-08T12:45:30.054Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 08, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-08T12:45:30.123Z: Starting 5 workers in us-central1-a...
    Sep 08, 2020 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-08T12:45:52.446Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 08, 2020 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-08T12:46:03.574Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 08, 2020 12:46:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-08T12:46:18.132Z: Workers have started successfully.
    Sep 08, 2020 12:46:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-08T12:46:18.168Z: Workers have started successfully.
    Sep 08, 2020 12:46:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-08T12:46:52.127Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 08, 2020 12:46:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-08T12:46:52.279Z: Cleaning up.
    Sep 08, 2020 12:46:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-08T12:46:52.371Z: Stopping worker pool...
    Sep 08, 2020 12:47:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-08T12:47:50.916Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 08, 2020 12:47:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-08T12:47:50.967Z: Worker pool stopped.
    Sep 08, 2020 12:48:01 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-08_05_45_21-18105873312394791471 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): ea0db0cd-5e98-445b-8a18-e9f0f723bdf5 and timestamp: 2020-09-08T12:48:01.369000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    16.144

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 08, 2020 12:48:01 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.019 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.033 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 2,5,main]) completed. Took 2 mins 54.142 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 46s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/vnosnkis3neyo

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #969

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/969/display/redirect>

Changes:


------------------------------------------
[...truncated 295.69 KB...]
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 08, 2020 6:47:59 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 08, 2020 6:47:59 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 08, 2020 6:47:59 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 08, 2020 6:47:59 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 08, 2020 6:47:59 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 08, 2020 6:48:00 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 08, 2020 6:48:00 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 08, 2020 6:48:00 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 08, 2020 6:48:00 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 08, 2020 6:48:01 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 08, 2020 6:48:01 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 08, 2020 6:48:01 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 08, 2020 6:48:01 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 08, 2020 6:48:01 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 08, 2020 6:48:01 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 08, 2020 6:48:03 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 08, 2020 6:48:09 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 08, 2020 6:48:10 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-DiGQ-1gF0Q3fVQhPjhWtF1NRoBphVrIiLRFzGUt5Bd4.jar
    Sep 08, 2020 6:48:11 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-mu4GaeUBDOmOzswP0Xyq2TEqk6dI__2O4fmI-kdGi5E.jar
    Sep 08, 2020 6:48:11 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-GGeM21ZNnFb-ODwiz4xOd7BoEAeavPU6etCGOOrESro.jar
    Sep 08, 2020 6:48:11 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-RKrs43PEazXnN5-80nkowlwv4LaimFvLbq8rsdk-JXA.jar
    Sep 08, 2020 6:48:11 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-F-mUFmlh9j938Cc5_ItgSNEUPC-OzyiuxZRCWY4dYqo.jar
    Sep 08, 2020 6:48:11 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-aD_PYbdalf6ix0-xc0zLVH42aNmvpTcGEp-bZUV1sn8.jar
    Sep 08, 2020 6:48:11 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7077540766455928530.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-0aB79QGPGcCCSPMueb8TFbn2iocxPVSALcAK8HemhCQ.jar
    Sep 08, 2020 6:48:11 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-Tc7BhQYYLzE6HZBPUvVqfJcyNT45kM0wGpoMbyCsB00.jar
    Sep 08, 2020 6:48:11 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-DiGQ-1gF0Q3fVQhPjhWtF1NRoBphVrIiLRFzGUt5Bd4.jar
    Sep 08, 2020 6:48:11 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-ZhQXJtO_mwEUycpoX4xMa4vactKHr23XRB6AngTodFA.jar
    Sep 08, 2020 6:48:11 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-wrCadZ6Z1qRmT4cRNSDblaelglxFslomhU_dzuu41xs.jar
    Sep 08, 2020 6:48:11 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tNo6hAehil8YL3oQ6knjYPjSz0kz2DodqtkuqgY0rHA.jar
    Sep 08, 2020 6:48:11 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-8Bkzf-jTxF5m7AJAFD01C4V6Q-FN6IbJ7_Bt63ZdQ9k.jar
    Sep 08, 2020 6:48:11 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-W63tplrMyWN1bxeYZdw7rDdpKgRl-lvTv1f-yJW4wuQ.jar
    Sep 08, 2020 6:48:11 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-nV5vTOWCPAOMWdZYZCN4zogn3pziiMlO1HT85sZ0eD4.jar
    Sep 08, 2020 6:48:11 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-PzyOx0dcBegUI3T4LiOK5cuDSK1GpyJMZmHolQy6M8s.jar
    Sep 08, 2020 6:48:11 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-P4FHcDuA8EiGn7KzH_C6ldtg-IQMjhl1rocBnX9fdkM.jar
    Sep 08, 2020 6:48:11 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-XUVTjtQ2no71fo4atfjV12riDW_7Htcf5u8hw_MbVas.jar
    Sep 08, 2020 6:48:11 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-ic7EV8HtQ0rZTBDvy2ZgPsc_XWFKLnVt7ZbnoZImYsU.jar
    Sep 08, 2020 6:48:11 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-nJiBrE6-Dh1g0bIpdZcJb8mY4Uex9_fflpTrlaUssW4.jar
    Sep 08, 2020 6:48:11 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-e-_HSVkXPtEkve58mKoV2u9vGziHqBQezlAdJ4Ly4AA.jar
    Sep 08, 2020 6:48:11 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-_qM4KInySZ8oZFKcb7waO7RRWuky7jdhe6F393bJoGs.jar
    Sep 08, 2020 6:48:11 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-Z16oLJDt53cmQddGIkbyUMRi_L_mS35s4rHgbd3BngM.jar
    Sep 08, 2020 6:48:11 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-Rh1yAcbVhoe_UIFzqYtN55jSl_N3uBvd7_q1gLSDhNY.jar
    Sep 08, 2020 6:48:11 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-1AE170X45g5--5RUVY657HYL4yJWUWhfGRsoWCEdaZA.jar
    Sep 08, 2020 6:48:11 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-ODTIfUj3wrCoJOePW8B6up0ooZ3h3MpV_vItSpWM1Jk.jar
    Sep 08, 2020 6:48:11 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-T6sn_S7NMbhSisOgh30XfDgEPCiTeeddB3ms507y_mQ.jar
    Sep 08, 2020 6:48:12 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT--4aK_sGWdHi9wJ2FFjaGLNp3u0L5-3QXz_xyfKM4rIo.jar
    Sep 08, 2020 6:48:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-7TWslxo5itZwSw1ahd-dIlzm6aqfI9tgOjnKfpcPhwg.jar
    Sep 08, 2020 6:48:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-4sNBrihz2bCRbpavyfa783fB0Sb4mEkHI_7qw7HO2ck.jar
    Sep 08, 2020 6:48:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-wEBOpLqzAh-vbYYcvcMyUUphpRBm-_6pNPloeHMlIhs.jar
    Sep 08, 2020 6:48:17 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 7 seconds
    Sep 08, 2020 6:48:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 08, 2020 6:48:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 08, 2020 6:48:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 08, 2020 6:48:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 08, 2020 6:48:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 08, 2020 6:48:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92060 bytes, hash c3517b81b8d1c524c2eecd062806d55e2385910d342653f917c7460ab349b885> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-w1F7gbjRxSTC7s0GKAbVXiOFkQ00JlP5F8dGCrNJuIU.pb
    Sep 08, 2020 6:48:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 08, 2020 6:48:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-07_23_48_18-9560300325878500362?project=apache-beam-testing
    Sep 08, 2020 6:48:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-07_23_48_18-9560300325878500362
    Sep 08, 2020 6:48:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-07_23_48_18-9560300325878500362
    Sep 08, 2020 6:48:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-08T06:48:18.868Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 08, 2020 6:48:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-08T06:48:26.701Z: Worker configuration: n1-standard-1 in us-central1-a.
    Sep 08, 2020 6:48:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-08T06:48:27.562Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 08, 2020 6:48:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-08T06:48:27.592Z: Expanding GroupByKey operations into optimizable parts.
    Sep 08, 2020 6:48:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-08T06:48:27.624Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 08, 2020 6:48:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-08T06:48:27.698Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 08, 2020 6:48:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-08T06:48:27.729Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 08, 2020 6:48:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-08T06:48:27.754Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 08, 2020 6:48:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-08T06:48:27.787Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 08, 2020 6:48:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-08T06:48:28.142Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 08, 2020 6:48:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-08T06:48:28.222Z: Starting 5 workers in us-central1-a...
    Sep 08, 2020 6:48:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-08T06:48:35.909Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 08, 2020 6:48:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-08T06:48:54.961Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Sep 08, 2020 6:48:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-08T06:48:54.998Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Sep 08, 2020 6:49:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-08T06:49:00.418Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 08, 2020 6:49:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-08T06:49:19.753Z: Workers have started successfully.
    Sep 08, 2020 6:49:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-08T06:49:19.783Z: Workers have started successfully.
    Sep 08, 2020 6:49:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-08T06:49:52.148Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 08, 2020 6:49:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-08T06:49:52.292Z: Cleaning up.
    Sep 08, 2020 6:49:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-08T06:49:52.369Z: Stopping worker pool...
    Sep 08, 2020 6:50:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-08T06:50:52.458Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 08, 2020 6:50:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-08T06:50:52.498Z: Worker pool stopped.
    Sep 08, 2020 6:51:02 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-07_23_48_18-9560300325878500362 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 368fe98b-d6d9-426d-9fc5-6747c11b25c7 and timestamp: 2020-09-08T06:51:02.406000000Z:
                     Metric:                    Value:
                   read_time                    14.419
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 08, 2020 6:51:03 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.069 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.072 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker,5,main]) completed. Took 3 mins 29.31 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 6m 19s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/n37fnvbigbx4g

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #968

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/968/display/redirect>

Changes:


------------------------------------------
[...truncated 294.37 KB...]
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 08, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 08, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 08, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 08, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 08, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 08, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 08, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
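
The exception above also names its own remedy: a PCollection of Beam Rows cannot have a Coder inferred, so a schema (or an explicit RowCoder) has to be attached via PCollection.setRowSchema or setCoder. Below is a minimal, self-contained sketch of that fix, not the test's actual code; the schema fields simply mirror the query's projection (author, type, title, score), and the DoFn only stands in for a step like ParDo(RowMonitor).

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        // Schema mirroring the projected columns; illustrative only.
        final Schema schema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt32Field("score")
                .build();

        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        PCollection<Row> rows =
            p.apply(Create.of("story", "job"))
                .apply(
                    "ToRow",
                    ParDo.of(
                        new DoFn<String, Row>() {
                          @ProcessElement
                          public void processElement(@Element String type, OutputReceiver<Row> out) {
                            out.output(
                                Row.withSchema(schema)
                                    .addValues("someone", type, "a title", 3)
                                    .build());
                          }
                        }))
                // Without this call the pipeline fails with the IllegalStateException above,
                // because no Coder can be inferred for Row; setRowSchema attaches a RowCoder.
                .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }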

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 08, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 08, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 08, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 08, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 08, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 08, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 08, 2020 12:45:18 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 08, 2020 12:45:18 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
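
For reference, the push-down logged above corresponds to what a plain BigQueryIO read does with the Storage API: request only the referenced columns and send the supported predicate as a row restriction. The following is a hedged sketch of that equivalent read, not the code this test runs; the table reference is a placeholder.

    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        p.apply(
            BigQueryIO.readTableRows()
                .from("my-project:my_dataset.HACKER_NEWS") // placeholder table reference
                .withMethod(TypedRead.Method.DIRECT_READ)  // BigQuery Storage API read
                // Only the columns the query touches are requested ...
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                // ... and the supported filter is pushed to BigQuery as a row restriction.
                .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }
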
    Sep 08, 2020 12:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 08, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 08, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-gQ_m01TP4lgRRJwu5W76LNuWGB2YunMe0VDxrurgKEM.jar
    Sep 08, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-APVyoCdeWzBtcHGpibR2TWnccbdtw_dOB22k5DniQ6w.jar
    Sep 08, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-Hh4Jcmo7UvGCgUDc4pz9NGOMTCyfhUfkF9_nUItPAho.jar
    Sep 08, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-Simx5jVzKHE0tRgtPfbG4NLBk8ATWqX00LjUCEp3-rw.jar
    Sep 08, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-FrXRGf3SrUCNcZcANklyb2nOD2bHXnvnA4u_ojoLi7Q.jar
    Sep 08, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT--pxd_hr_PxgrWQ8OdO-Q9qYSCz_MzXahdSCcnUEXFJY.jar
    Sep 08, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-FBOTPDV8FfgCIpuAezmc4Fv8k6p6eJpUTJzweRiqqMQ.jar
    Sep 08, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-BjkM6cDXrbixMFsv-W9xHL_koHm_Isou7rmzrm8OAdc.jar
    Sep 08, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-XuvO2MQJE7FYjJTEyJyEUznLzUhxAIyhCkDMTIrWdc8.jar
    Sep 08, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-3DbDq5Mt2MC2BMQBzBZa8__So5_nLslLySKsWpcORJI.jar
    Sep 08, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-TtdfduYxENhGt99knhub0tGlb9GBnycU-atD9fWtzoc.jar
    Sep 08, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests--jIwXXxqcMncPrllW9adoFcjZN17AqzKNWzV8U3NYGU.jar
    Sep 08, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-9BvF27mmpG6ymIdu3V6CvtC8Jyf-zjo7509xHRLgRyg.jar
    Sep 08, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-W1NP8iGs5HeWYWXiz4x58NlwaMwVyzx_xMDFV1RNBM8.jar
    Sep 08, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-ovcDrQd1Rf-rPqu61qCgBrEXGtzYGfwT0oAU6dOests.jar
    Sep 08, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-IgbNhrzrtxBSAp5JplZ5EaIQODphJR0T0opnmwebYDU.jar
    Sep 08, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-A08UC1bCnYlNNXJ0ugZNjjf11mhRECl4h_yQzod1TXY.jar
    Sep 08, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-gQ_m01TP4lgRRJwu5W76LNuWGB2YunMe0VDxrurgKEM.jar
    Sep 08, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-FNKXVviuUqsGRBXbJPdT3wuuS9voj7LEC4aKdmz4YYU.jar
    Sep 08, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-mjnKvZi4ZRr9syJLwRnlfYFyAxrupgsOKrw9fhV1Tio.jar
    Sep 08, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-XEVYqb5SQi5LrE3rchrll1bTZEkSx4axrpTE9VUSXdQ.jar
    Sep 08, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-INAKIUnMDmWsGK4Yt1SoHxnG3uFRrVT5TtawnXgoWOA.jar
    Sep 08, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-wELHo4A-b7OXDtra6YKhwc7GuscCXdf3q8kmDmANM9Y.jar
    Sep 08, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-DzxzIQGHnSynK-0YR6towhuiPbZsnCIOPrWJogy4_EA.jar
    Sep 08, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4152512236796075404.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-kd0Ure-hVYanf0FI0UsadsswG6mc57Yj7tr_fSaGShU.jar
    Sep 08, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-CMJ7Qc8qtsSgpExHEcl3ZIErzIzoIWBRBuNFYNdE_ts.jar
    Sep 08, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-mQ9Bzo_scDTbXgyWo1zbFOqajsBLyxwmqBCxr23Hub8.jar
    Sep 08, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-DeD3EAYomF-eanN6XbmnB9Ohn1KRdGfMj8Q-EAb5b7U.jar
    Sep 08, 2020 12:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-qmbIUZgPpjBpZ2W4A_wvq8jqikL1-tnDjkFF-3LXfJg.jar
    Sep 08, 2020 12:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-F8uZv8xilKYOQwCsyzTltK9ebceVM-VigG2_SlC-by4.jar
    Sep 08, 2020 12:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-vymT0WVQUn6MeczUxN9zbzwDVAjF--alnaMUDNtAKXY.jar
    Sep 08, 2020 12:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 0 seconds
    Sep 08, 2020 12:45:21 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 08, 2020 12:45:21 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 08, 2020 12:45:21 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 08, 2020 12:45:21 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 08, 2020 12:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 08, 2020 12:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92060 bytes, hash 9255c45c5594dcbfd6487b5bb759fe2bd23682e6af4b7a9ff22142c860e059f5> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-klXEXFWU3L_WSHtbt1n-K9I2guavS3qf8iFCyGDgWfU.pb
    Sep 08, 2020 12:45:22 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 08, 2020 12:45:23 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-07_17_45_22-15553383775998443989?project=apache-beam-testing
    Sep 08, 2020 12:45:23 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-07_17_45_22-15553383775998443989
    Sep 08, 2020 12:45:23 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-07_17_45_22-15553383775998443989
    Sep 08, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-08T00:45:22.225Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 08, 2020 12:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-08T00:45:29.659Z: Worker configuration: n1-standard-1 in us-central1-a.
    Sep 08, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-08T00:45:30.357Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 08, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-08T00:45:30.403Z: Expanding GroupByKey operations into optimizable parts.
    Sep 08, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-08T00:45:30.437Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 08, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-08T00:45:30.507Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 08, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-08T00:45:30.541Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 08, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-08T00:45:30.576Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 08, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-08T00:45:30.613Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 08, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-08T00:45:31.191Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 08, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-08T00:45:31.263Z: Starting 5 workers in us-central1-a...
    Sep 08, 2020 12:45:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-08T00:45:47.789Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 08, 2020 12:46:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-08T00:46:02.187Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
    Sep 08, 2020 12:46:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-08T00:46:02.219Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
    Sep 08, 2020 12:46:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-08T00:46:07.583Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 08, 2020 12:46:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-08T00:46:23.689Z: Workers have started successfully.
    Sep 08, 2020 12:46:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-08T00:46:23.712Z: Workers have started successfully.
    Sep 08, 2020 12:47:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-08T00:47:01.616Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 08, 2020 12:47:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-08T00:47:01.758Z: Cleaning up.
    Sep 08, 2020 12:47:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-08T00:47:01.843Z: Stopping worker pool...
    Sep 08, 2020 12:47:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-08T00:47:51.827Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 08, 2020 12:47:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-08T00:47:51.878Z: Worker pool stopped.
    Sep 08, 2020 12:47:59 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-07_17_45_22-15553383775998443989 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): e6976614-b487-4feb-98c0-3c9ae8469b63 and timestamp: 2020-09-08T00:47:59.229000000Z:
                     Metric:                    Value:
                   read_time                    19.893
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 08, 2020 12:47:59 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 2 mins 51.338 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 42s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/mzdpw4tv5pz3u

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #967

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/967/display/redirect>

Changes:


------------------------------------------
[...truncated 295.16 KB...]
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 07, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 07, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 07, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 07, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 07, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 07, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 07, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 07, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 07, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 07, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 07, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 07, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 07, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 07, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 07, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 07, 2020 6:45:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 07, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 07, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-ceAJZP2RxohIdvqeIx8zERABdALooGChbY_8FbLm3tc.jar
    Sep 07, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-OYm_MULKt8Z_t2DjhH_a3GwO2ytnxv7sGBwKSqjeS-4.jar
    Sep 07, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-hC1PZcxL28WlBSOnvuutAcJfBKwU4-51PmwKMZaqOrk.jar
    Sep 07, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-JYmNy8HXBk6RPdZ-yeVhqb4E8Ouct3eAw0Gc7tLKx0A.jar
    Sep 07, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-7soNh-eeoOto6fVpy1YdgnQH8uvSCE4VnKU_8wwgMRY.jar
    Sep 07, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-v1u74IdyZIKv6rKddicyBFFi2fypr4NAv0QJbhJeR70.jar
    Sep 07, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests--hQioAVpDH6ticMuwrk00fGRJ920ZLB-QMCu7faC1VY.jar
    Sep 07, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-dFvfVLJEu_1v3MYFyYBJrOTLum6oxYM8aga7BN77vB4.jar
    Sep 07, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-ChQzMucolGWXeHJ_Drlym36gq2fMQhknrylMFChmfYs.jar
    Sep 07, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-v5biouc0FevDpKJ7tng9xmcRKc3OvTypd6rO_IJTtJo.jar
    Sep 07, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-g753sR86GYnMeAoIr3H1s1FLFm4D5WQjBtjW9rRKits.jar
    Sep 07, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-CxU-y3Y-WQ93eSc_CVt5TPduMbSCtfufM3kX92aUM5I.jar
    Sep 07, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-RfkAE4sKyAPgJvLijbnf8M-iE1O9aKOawcL0HeuQDK0.jar
    Sep 07, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT--VYE2We9JUdZ8DQpz1JbOOIROBAj0mV4GORDO1WFnng.jar
    Sep 07, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-I8TkM1FD6aAEW8IR-haFTYQcXSFzChwUPT_0ICIj6gE.jar
    Sep 07, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests--0QzRZaDAdlViHbX7WPpWCcja8CplhYPgonqvk25EZ0.jar
    Sep 07, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-NDCxtTO7_VQuf61WIXp755BdtR4twGezNqlh8mjHwro.jar
    Sep 07, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-ceAJZP2RxohIdvqeIx8zERABdALooGChbY_8FbLm3tc.jar
    Sep 07, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-aNIHtn-TFDLIWkItjgbklctdlSh82zsdV-XJUwJop3Q.jar
    Sep 07, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-sTwOLq6gdd0FMWPv2Lqitx5FFxiB2T4BXfDSrsDKZtM.jar
    Sep 07, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-ii95EX2Wbm_UVzoS290afvw_cv_piCpGvz83xcTZNhI.jar
    Sep 07, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-P9LvmZPrvzgzP9wVB3qes3zLjTwEQlVkaRu_o9A-RP4.jar
    Sep 07, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-qiIYC-we5N2_9SnTTc9ySAboJ44W-hKJNrlJ5IGSJK0.jar
    Sep 07, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-69Et95r6yHuNXcv8UIX3ygtn6ERc188HBHPManOf_TU.jar
    Sep 07, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3752900658203352513.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-RHaSiDIpFxANQUajI0LHhH6aRcn2ixzVQlhhVnehfi4.jar
    Sep 07, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-0HwNOEBHoKSEePhKTrS1Vn6Brd8YhdwYoeByCzYiVEA.jar
    Sep 07, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-LVFWPpRePoNB1lf3nDAgtd8shWTUjcpgiYFBZoSmfCk.jar
    Sep 07, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-rHNA9R_PbiC3q3Naf9Lxi3sKzgDn43HxexKRH4Ua0BI.jar
    Sep 07, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-Lr1ux6J_Kab_MS47PAKGSSlXJXsARAP57q3XGBwDKcw.jar
    Sep 07, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-TV4pvlRl9S3Pa0Qs2i5Cx-8PFVThZ18tZBtB3obfCXk.jar
    Sep 07, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-6H2NRymYClTcBtAyT2C-OmQgfT7D_Qs9byT8uAjNJ3s.jar
    Sep 07, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 1 seconds
    Sep 07, 2020 6:45:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 07, 2020 6:45:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 07, 2020 6:45:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 07, 2020 6:45:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 07, 2020 6:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 07, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92059 bytes, hash df00f542525041dbc94919c975136d0e904fa121834e0f0137ae513111d2cd7c> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-3wD1QlJQQdvJSRnJdRNtDpBPoSGDTg8BN65RMRHSzXw.pb
    Sep 07, 2020 6:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 07, 2020 6:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-07_11_45_19-11150862791688056029?project=apache-beam-testing
    Sep 07, 2020 6:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-07_11_45_19-11150862791688056029
    Sep 07, 2020 6:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-07_11_45_19-11150862791688056029
    Sep 07, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-07T18:45:19.715Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 07, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-07T18:45:27.146Z: Worker configuration: n1-standard-1 in us-central1-a.
    Sep 07, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-07T18:45:27.764Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 07, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-07T18:45:27.802Z: Expanding GroupByKey operations into optimizable parts.
    Sep 07, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-07T18:45:27.830Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 07, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-07T18:45:27.899Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 07, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-07T18:45:27.939Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 07, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-07T18:45:27.974Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 07, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-07T18:45:27.999Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 07, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-07T18:45:28.454Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 07, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-07T18:45:28.535Z: Starting 5 workers in us-central1-a...
    Sep 07, 2020 6:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-07T18:45:55.630Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 07, 2020 6:48:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-07T18:48:30.014Z: Workers have started successfully.
    Sep 07, 2020 6:48:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-07T18:48:32.624Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
    Sep 07, 2020 6:48:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-07T18:48:32.654Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
    Sep 07, 2020 6:48:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-07T18:48:38.383Z: Workers have started successfully.
    Sep 07, 2020 6:49:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-07T18:49:09.731Z: Autoscaling: Raised the number of workers to 2 based on the rate of progress in the currently running stage(s).
    Sep 07, 2020 6:49:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-07T18:49:09.782Z: Resized worker pool to 2, though goal was 5.  This could be a quota issue.
    Sep 07, 2020 6:49:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-07T18:49:10.902Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 07, 2020 6:49:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-07T18:49:11.104Z: Cleaning up.
    Sep 07, 2020 6:49:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-07T18:49:11.251Z: Stopping worker pool...
    Sep 07, 2020 6:52:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-07T18:52:32.342Z: Autoscaling: Resized worker pool from 2 to 0.
    Sep 07, 2020 6:52:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-07T18:52:32.388Z: Worker pool stopped.
    Sep 07, 2020 6:52:39 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-07_11_45_19-11150862791688056029 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 7db30e32-89f2-4116-ba91-30ba653d13c0 and timestamp: 2020-09-07T18:52:39.861000000Z:
                     Metric:                    Value:
                   read_time                    18.601
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 07, 2020 6:52:40 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.034 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 7,5,main]) completed. Took 7 mins 34.353 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 23s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/anul2yj6wv2gc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #966

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/966/display/redirect>

Changes:


------------------------------------------
[...truncated 293.26 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 07, 2020 12:45:23 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 07, 2020 12:45:23 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 07, 2020 12:45:24 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 07, 2020 12:45:24 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 07, 2020 12:45:24 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 07, 2020 12:45:24 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 07, 2020 12:45:24 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
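
The exception above is Beam's generic complaint that a PCollection<Row> reaches pipeline construction with neither an explicit Coder nor a Schema attached. A minimal, self-contained sketch of the remedy the message itself suggests follows; the field names mirror the query in this log, but the field types, values, and transforms are illustrative assumptions rather than the test's actual code:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.MapElements;
    import org.apache.beam.sdk.transforms.SimpleFunction;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaExample {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        // Field names follow the HACKER_NEWS query above; types are assumed for illustration.
        Schema schema = Schema.builder()
            .addNullableField("author", Schema.FieldType.STRING)
            .addNullableField("type", Schema.FieldType.STRING)
            .addNullableField("title", Schema.FieldType.STRING)
            .addNullableField("score", Schema.FieldType.INT32)
            .build();

        Row row = Row.withSchema(schema).addValues("someone", "story", "a title", 3).build();

        // The output of a transform that emits Row has no inferable coder, which is exactly
        // the situation the IllegalStateException describes. Attaching the schema with
        // setRowSchema (or a coder with setCoder) resolves it.
        PCollection<Row> rows =
            p.apply(Create.of(row).withRowSchema(schema))
             .apply(MapElements.via(new SimpleFunction<Row, Row>() {
               @Override
               public Row apply(Row r) {
                 return r;
               }
             }))
             .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }

In the failing run the schema-less collection is the output of ParDo(RowMonitor) named in the message, so a real fix would attach the schema (or a coder) at that point; the sketch above only illustrates the mechanism.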

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 07, 2020 12:45:24 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 07, 2020 12:45:24 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 07, 2020 12:45:24 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 07, 2020 12:45:24 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 07, 2020 12:45:24 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 07, 2020 12:45:24 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 07, 2020 12:45:24 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 07, 2020 12:45:24 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
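
For context on what this push-down amounts to at the IO level: the SQL planner requests only the four used fields and forwards the filter to the BigQuery Storage API as a row restriction. A rough hand-written equivalent using BigQueryIO directly is sketched below; the table reference is a placeholder and the snippet assumes GCP credentials and pipeline options are configured, so treat it as an illustration of DIRECT_READ with selected fields and a row restriction rather than what the test itself executes:

    import com.google.api.services.bigquery.model.TableRow;
    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
    import org.apache.beam.sdk.values.PCollection;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        // Project only the fields the query uses and push the WHERE clause down to the
        // BigQuery Storage API, mirroring what the plan above does automatically.
        PCollection<TableRow> rows =
            p.apply(BigQueryIO.readTableRows()
                .from("my-project:my_dataset.HACKER_NEWS")  // placeholder table reference
                .withMethod(Method.DIRECT_READ)
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }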
    Sep 07, 2020 12:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 07, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 07, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-HRYTrBddN5J5rLL1LTS0oB6DLAeMjFOsvpQFWKYthGo.jar
    Sep 07, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-1juof2fQ1XckFw35AseAnJ0BPjzlTgCRlzYW_yhsXU0.jar
    Sep 07, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-Y_TNqWZi-ybnxqOiDSzwaZY7nrFRpm2dPa0HE5_PZjs.jar
    Sep 07, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-gwXMQwT-sC1ad7UG-yJ5fFiYDqkias-unV1CLghsAkM.jar
    Sep 07, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-IJU9mxi4YgYIQzEeyA0sV8CwAvDvCopTmU0D04wOhLk.jar
    Sep 07, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-UVOQtdz4RTDYF-zIwDqbmj_3Sj5poHW0TNPTe0Q2om8.jar
    Sep 07, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-boMTyu-6_bqmg7n6d3Jn52rUFqr7y_XulS9KtHNFYks.jar
    Sep 07, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-ozJuu1Jw2L3TpzDLsHtB2k7tAk-zgPswnKPfCrpXruk.jar
    Sep 07, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4858247812504520888.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-lWMd6PdI29QArZqIY38EXEayKYT9HiMKI57dYKH3Y_c.jar
    Sep 07, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-9DQYVYV5nLKGYEzh7SUWIZEKv4n4y60nPEahEbOMDQU.jar
    Sep 07, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-oOzYX4X8fazXZHOT-06k7dlUO_irQj-KV7e0mU69A2M.jar
    Sep 07, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-erwsu4xaqezKRmC60LcNp_Yl3pA8_IB3srz35mavM9Y.jar
    Sep 07, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-_MgJS7hk_Ez1r2TrdQE10SeMs4fIwNUGP_TfBBu5ebI.jar
    Sep 07, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-tpa48beFpmSWgCbsafw-Iq3bFo2PNW2q2NI14OF-4Kk.jar
    Sep 07, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-Ir4zJASyayI4JJVnjPFJY-XuiZRuF8-dH5Ulb8Cdcoc.jar
    Sep 07, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-CE2mim0ETpBDaZlOe1XMs0_bt4Lce9bhxe1QEwcQH48.jar
    Sep 07, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-HjM7_e4KFL382_EMCQZCBtlvfGgqqhKrpWCK2Wdvq6M.jar
    Sep 07, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-UL8d4by4KN8qgW1v1jSW7e6m2COYYxckFKNYFUvqrL0.jar
    Sep 07, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-s_qYxqPZsjNPw4n8pSU2YjQrLkKKx4eXC-9VyiL1aPg.jar
    Sep 07, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-lSrLOa0MDGiva7arqwIO9M4Ey64acqHFjMOvqV6-VS4.jar
    Sep 07, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-HRYTrBddN5J5rLL1LTS0oB6DLAeMjFOsvpQFWKYthGo.jar
    Sep 07, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-438nWHCTxHMnMjuo2XkgU0CupQuM0PeUZRdovpBfjhA.jar
    Sep 07, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-x-d1HsMnY_pHBQKzWM78e3YXCHpvZBvJXycx5S-dt1k.jar
    Sep 07, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-iMHWTXSkQgRPMZ3yzThT8gfAcZaahNMy0sk8uSA9t48.jar
    Sep 07, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-IKcO2iUbqZRjA7Ev5pioW9a7I0b7fMjCg3lxU3Jg89s.jar
    Sep 07, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-ZwWV4VP2rJDRyFMdgtpSzBYxFD17N5wYxQuWjBQ1P1c.jar
    Sep 07, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-8_18x09ZhRp9R83XWTOF3Y3Lawz7Lz7feIl5H7hTCLc.jar
    Sep 07, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-75qHZtrmUl_Oi07hXVVHyhCUDZMPfzpD1fc5dHcVrSc.jar
    Sep 07, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-LMWYWfbzhSuCEfAinZkmwgbNYcSvcHvCF0vN2JVT2HY.jar
    Sep 07, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-QhkIftI6QCYRWZYUU_fCbKGDq1vA3iVaNTAZYZjGn5E.jar
    Sep 07, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-o34sVoGbKSK24PN_wbVf-Cif6jmVFfK7eXxUvY_3DHw.jar
    Sep 07, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 2 seconds
    Sep 07, 2020 12:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 07, 2020 12:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 07, 2020 12:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 07, 2020 12:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 07, 2020 12:45:30 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 07, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92060 bytes, hash 3f96640612dc6ab81e0c54ebbfdaf672d8d0d259165c58f0b830dafbb75f94f5> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-P5ZkBhLcargeDFTrv9r2ctjQ0lkWXFjwuDDa-7dflPU.pb
    Sep 07, 2020 12:45:30 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 07, 2020 12:45:32 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-07_05_45_30-16168137581901877777?project=apache-beam-testing
    Sep 07, 2020 12:45:32 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-07_05_45_30-16168137581901877777
    Sep 07, 2020 12:45:32 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-07_05_45_30-16168137581901877777
    Sep 07, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-07T12:45:30.993Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 07, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-07T12:45:38.119Z: Worker configuration: n1-standard-1 in us-central1-f.
    Sep 07, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-07T12:45:39.046Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 07, 2020 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-07T12:45:39.086Z: Expanding GroupByKey operations into optimizable parts.
    Sep 07, 2020 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-07T12:45:39.112Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 07, 2020 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-07T12:45:39.232Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 07, 2020 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-07T12:45:39.264Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 07, 2020 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-07T12:45:39.301Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 07, 2020 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-07T12:45:39.335Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 07, 2020 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-07T12:45:39.762Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 07, 2020 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-07T12:45:39.838Z: Starting 5 workers in us-central1-f...
    Sep 07, 2020 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-07T12:45:54.812Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 07, 2020 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-07T12:46:03.848Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 07, 2020 12:46:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-07T12:46:20.716Z: Workers have started successfully.
    Sep 07, 2020 12:46:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-07T12:46:20.743Z: Workers have started successfully.
    Sep 07, 2020 12:46:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-07T12:46:54.198Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 07, 2020 12:46:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-07T12:46:54.346Z: Cleaning up.
    Sep 07, 2020 12:46:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-07T12:46:54.430Z: Stopping worker pool...
    Sep 07, 2020 12:47:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-07T12:47:37.372Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 07, 2020 12:47:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-07T12:47:37.411Z: Worker pool stopped.
    Sep 07, 2020 12:47:45 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-07_05_45_30-16168137581901877777 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 72a26916-4177-4dcc-9f0d-09583fe3fb3c and timestamp: 2020-09-07T12:47:45.652000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    15.311

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 07, 2020 12:47:46 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.031 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.04 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 7,5,main]) completed. Took 2 mins 33.66 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 27s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/7o4l5n6uhtgle

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #965

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/965/display/redirect?page=changes>

Changes:

[SergiyKolesnikov] Fix BEAM-10661: Java quickstart using Gradle doesn't work


------------------------------------------
[...truncated 294.04 KB...]
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 07, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 07, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 07, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 07, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 07, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 07, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 07, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 07, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 07, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 07, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 07, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 07, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 07, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 07, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 07, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 07, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 07, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 07, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-pdS_8QQVThjPvAM2DqkihFpOvCLBlxnDWc5GYDLAiSE.jar
    Sep 07, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-v22qiMbnMjJ7fM4b58J9sE8dfUwoaQBqLBCRz_sa_EE.jar
    Sep 07, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-4PiNEAdaMP1fYO3MkI3A-E39l72souhfoajM-rfP5mY.jar
    Sep 07, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-LaDJdo-CeJw5uACySIKII20MQN-9WSqRb44F1qfb9hY.jar
    Sep 07, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-qYdcolKeFmA_Ve1sxpiFU6zU-T39EPj3I-FDEwVUg1g.jar
    Sep 07, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-LqDcfMd1u9iT0cDsVb0cLgDPfqtt3yCCIycFISN6s8o.jar
    Sep 07, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-QWklY9QJad3Snhp4PYPy1_nDLTUEdApJ1RlNCGTtyWg.jar
    Sep 07, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-NSV24ngD5cIgctXbMDmNgk93gVCUCA7nvq3cMRYWM18.jar
    Sep 07, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-7ze_5Vcsa7IHoWSTk-OWtLMuYHu0pZ0VvPq8d2picBg.jar
    Sep 07, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-u4bF-3WwdfrMCZsrWLqLVm_WK-FUTnRW82mFdpR0RlU.jar
    Sep 07, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-sIJ8ojAdm2__Ga7GbG1oeZtpEk9NA4kWa3LChEBj7iI.jar
    Sep 07, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-EoGoGaCIhxBzIb_CiXMtTu_3u94M35SHgqRXzgJJb_k.jar
    Sep 07, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-pdS_8QQVThjPvAM2DqkihFpOvCLBlxnDWc5GYDLAiSE.jar
    Sep 07, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2031881302758280474.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-UtwZE-m3EbIK_yqLBJCauQhcXEcPo6wncqs42_XmnDg.jar
    Sep 07, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-d9WcpXqbVg8ZXhQvd5N9MoWlU1lc-UDGEcFgx0STX7U.jar
    Sep 07, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-zbNGWEII8RXdWkncnZZEuPjk4ditQrWy8gHGcBWatGc.jar
    Sep 07, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-BgAnBqEvxpTNO7MJ1RoMGBTnyLzwyRLc4D_qZzNs_Mw.jar
    Sep 07, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-mR9Kwcl79YjGlhPfsl3GdQv8cgIb5h4luzinRKIGINY.jar
    Sep 07, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests--DzQ0bgWs3w9X9VxYCtPQNsKR59I3HPgl-e9pU8_SUw.jar
    Sep 07, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-4VNhhLoJBGsZjkurbmqYC9-R7G0RCaOBQGAeChGMG-Y.jar
    Sep 07, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT--mzhdiuPmlTdSiAdrJY4IG5AXc5OCHZd7OivIDSN-l8.jar
    Sep 07, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-dFY53lM1WtP6-c3l_1LbLvyeFKQI8xJnd4Orm-njtjc.jar
    Sep 07, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-trp1YB20JOrGUOU7z_EfGxSKsZn_NBnmvxvKGbWr4hs.jar
    Sep 07, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-r1Z5iJbX84gHlbRSXYhsuS-Lglp6ppc1fZK_QOjm760.jar
    Sep 07, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-7-IvF-SxC5nBwhgIQ87ga4ahYNP02ccEl57q-hTP9E0.jar
    Sep 07, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-1dDhbPgh9e0kbAd1MUBi_UVH3rrg7YG1jPu-Da_3L6o.jar
    Sep 07, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-SUI7CRM1ACLHGGf7sYevcXkQDQsvrNQ1P9jjCKs5xzg.jar
    Sep 07, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-pRIAU_mPAukQEI6dyBkSPbri3w8bfbVc8NZPoSt2hkw.jar
    Sep 07, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-02XUIFHr-UPHFkKh3esCB28rcLfPVR_wdVYDVWyOifs.jar
    Sep 07, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-kHDY5wGOf98G4ydY3D7lDn3iobz7C7guIhjukx82LrQ.jar
    Sep 07, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-l4J8m9Hp8N3GsS1GNwPGwq2t_m5lNlTcHPXv7fVHafI.jar
    Sep 07, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 0 seconds
    Sep 07, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 07, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 07, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 07, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 07, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 07, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92060 bytes, hash 3526766f585a07dccd1c74944bf89db6b19e542194fb6a0519d303aed3258e51> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-NSZ2b1haB9zNHHSUS_idtrGeVCGU-2oFGdMDrtMljlE.pb
    Sep 07, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 07, 2020 6:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-06_23_45_20-10776441524469835520?project=apache-beam-testing
    Sep 07, 2020 6:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-06_23_45_20-10776441524469835520
    Sep 07, 2020 6:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-06_23_45_20-10776441524469835520
    Sep 07, 2020 6:45:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-07T06:45:20.679Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 07, 2020 6:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-07T06:45:28.843Z: Worker configuration: n1-standard-1 in us-central1-a.
    Sep 07, 2020 6:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-07T06:45:29.684Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 07, 2020 6:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-07T06:45:29.730Z: Expanding GroupByKey operations into optimizable parts.
    Sep 07, 2020 6:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-07T06:45:29.806Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 07, 2020 6:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-07T06:45:29.876Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 07, 2020 6:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-07T06:45:29.910Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 07, 2020 6:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-07T06:45:29.946Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 07, 2020 6:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-07T06:45:29.982Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 07, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-07T06:45:30.518Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 07, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-07T06:45:30.602Z: Starting 5 workers in us-central1-a...
    Sep 07, 2020 6:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-07T06:45:51.205Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 07, 2020 6:46:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-07T06:46:02.558Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Sep 07, 2020 6:46:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-07T06:46:02.592Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Sep 07, 2020 6:46:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-07T06:46:07.966Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 07, 2020 6:46:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-07T06:46:25.995Z: Workers have started successfully.
    Sep 07, 2020 6:46:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-07T06:46:26.034Z: Workers have started successfully.
    Sep 07, 2020 6:47:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-07T06:47:10.255Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 07, 2020 6:47:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-07T06:47:10.411Z: Cleaning up.
    Sep 07, 2020 6:47:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-07T06:47:10.502Z: Stopping worker pool...
    Sep 07, 2020 6:47:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-07T06:47:53.613Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 07, 2020 6:47:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-07T06:47:53.657Z: Worker pool stopped.
    Sep 07, 2020 6:48:03 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-06_23_45_20-10776441524469835520 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 028181a7-3264-4ee1-9cb5-f54b809c702e and timestamp: 2020-09-07T06:48:03.762000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    22.077

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 07, 2020 6:48:04 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
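
This warning is why no metrics from the run above reached the metrics store: the publisher was not given a target measurement/database. Those values are normally supplied through the test's pipeline options; the property names and values below are assumptions shown for illustration only, not this job's actual configuration.

    -DintegrationTestPipelineOptions='["--influxDatabase=beam_test_metrics","--influxMeasurement=sql_bqio_read_java_batch","--influxHost=http://localhost:8086"]'
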

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.031 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.038 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker,5,main]) completed. Took 2 mins 56.806 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 47s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/wnzorjgztcih4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #964

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/964/display/redirect>

Changes:


------------------------------------------
[...truncated 295.50 KB...]
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
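
The exception message above also names the remedy: when a ParDo emits Beam Row values, no coder can be inferred for its output, so the output PCollection needs an explicit schema (or a RowCoder). A minimal sketch of that fix, using a hypothetical pass-through DoFn in place of the test's RowMonitor and field types guessed from the query's projected columns:

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    class RowMonitorSketch {
      // Hypothetical pass-through DoFn standing in for RowMonitor.
      static class PassThroughFn extends DoFn<Row, Row> {
        @ProcessElement
        public void process(@Element Row row, OutputReceiver<Row> out) {
          out.output(row);
        }
      }

      // Attach a schema to the DoFn output so the runner can derive a RowCoder.
      static PCollection<Row> monitor(PCollection<Row> input) {
        Schema schema =
            Schema.builder()
                .addNullableField("author", Schema.FieldType.STRING)
                .addNullableField("type", Schema.FieldType.STRING)
                .addNullableField("title", Schema.FieldType.STRING)
                .addNullableField("score", Schema.FieldType.INT64)
                .build();
        return input
            .apply("RowMonitor", ParDo.of(new PassThroughFn()))
            .setRowSchema(schema); // equivalently: .setCoder(RowCoder.of(schema))
      }
    }

This only illustrates the generic PCollection.setRowSchema remedy the message points to; whatever change caused the output schema to go missing in the SQL rels is not visible from this log alone.
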

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 07, 2020 12:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 07, 2020 12:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 07, 2020 12:45:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 07, 2020 12:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 07, 2020 12:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 07, 2020 12:45:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 07, 2020 12:45:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 07, 2020 12:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 07, 2020 12:45:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 07, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 07, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-zPiq0PMfRJyQPL_AHLtjdUjGVak1hRR_h8P4txWt2wk.jar
    Sep 07, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-s2GuEAt9s7uWmIErhBVhOkJdg8fpvAAwQz9OJJrnjVg.jar
    Sep 07, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-atYY3MWWlwss08d0osxcwxhEcAQ_4wFYR2CZYXPbSFc.jar
    Sep 07, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-onmooJ_L1c8WV18B0QymZ1lJLzJgtOMNmgYJynzOPRw.jar
    Sep 07, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-D8mDnKDZKoP8wofH7qQ73A6w4I5s489l-wThRR-tsR4.jar
    Sep 07, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-g5Q18lSGZlq4t9s1avRSwkZtIU4b7_zSpjxyjeek5dc.jar
    Sep 07, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-Fm1UzsPCiwj5dO4enM4y74fop1Y6ovMP8ut1OltF6uA.jar
    Sep 07, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-1_iPylOa2YWafzzGjHYZikfNZjgeRDU91WA4BLv6Hak.jar
    Sep 07, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-N7TQNBxr4giwCAMjX-0rYkNr8MZbYPIA5UzcmVq_hig.jar
    Sep 07, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-9OV9RKoIYSmvxad21bSUGR8u2Jw5mZ0CXlp1YX1-m_M.jar
    Sep 07, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-TRTouE5t6te4eSyfa3_PC29KUU1Gf1cOkpb6suoeNnk.jar
    Sep 07, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-kEI_Jr8uLNuEYfa4U0Rn7MHtsiKLHItrJ14kx6EA7JI.jar
    Sep 07, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4252306150179746873.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-7RRM7x3XYKtyZMAiB9ln5JxYbZYEp7V__Uw7fdM8NKE.jar
    Sep 07, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-xVlwuNrLQhV2MYQ4g0Y7lgPPD-yUoV8yloRJwC4Lmhs.jar
    Sep 07, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-M96KEHJtUpRQ0EsiRgB7HhMqjWOUNnbLTkk2WtmVYMY.jar
    Sep 07, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-1XjnunuCnaIpAqTUgPjdbRL3rnIptSshJ-YYX8z6UT8.jar
    Sep 07, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-nsv0dNaYDrHUcxovglOBPF5Xh8bYZozqXw7JWNPNQz8.jar
    Sep 07, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-TIbj0M7ykSOpWyd2a_X6wwikqz9Ko3Dq19XpTxkYHsE.jar
    Sep 07, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-vMF9snjO4xJ4uJrJLj9AYYnoShMquZIey3Yywwhx8Ms.jar
    Sep 07, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-qJxqcUnMEJjp-XXDkhEK3ITKa-xoDLhOX3yB7lSv5Uk.jar
    Sep 07, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-hPF1JmaDkte5UFqirsofR9R5NQHJrawvTZoVB8nJ5v0.jar
    Sep 07, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-MK6cC2TuEfLYb1aSHvv4er0cUapEam1r9cMwv3V3drs.jar
    Sep 07, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-zPiq0PMfRJyQPL_AHLtjdUjGVak1hRR_h8P4txWt2wk.jar
    Sep 07, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-06o--OTNatAi8UmJ-zTFPqXbuml8aHjH7hGAbxDc_CU.jar
    Sep 07, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-rg3A7C7l9o7LtOvHUl6Bs4yhGZcVMTTctBqsd2mLHs8.jar
    Sep 07, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-eo_L3HG6T-TpbEsG8QOk-aPerc-N60F9bRm6AksjFP0.jar
    Sep 07, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-clKzf2EYKepP_L77yrxc-BsNAxLbhiz7har9VQbQagA.jar
    Sep 07, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-DF9HrY9xhpaCunLC9rB4kW2YKpB9fzy8vDOIfJeVPyI.jar
    Sep 07, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-jQh_VwDvIjC0wG_VJEwuMp3VOe3z1Vl9QsXXtX4LWC8.jar
    Sep 07, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/de.flapdoodle.embed/de.flapdoodle.embed.process/2.1.2/986b38302fa10018d5baccee7f655c31ee9afe5b/de.flapdoodle.embed.process-2.1.2.jar to gs://temp-storage-for-perf-tests/loadtests/staging/de.flapdoodle.embed.process-2.1.2-OasY7D5KRAimcZcWcjFwgi8Qb4B-iff1FfrVeNSih6A.jar
    Sep 07, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-yw9YVyOYiMvZ5eV1ZIOUgzRddQfevN9xwKCRelGe6is.jar
    Sep 07, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-WTUbHqGaULKfnJq_9UGApLj_IBMk1GjSYuPe2sNQdrM.jar
    Sep 07, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.codehaus.janino/janino/3.0.11/e699e368095379ba0402ea1780a87fcaea16e68f/janino-3.0.11.jar to gs://temp-storage-for-perf-tests/loadtests/staging/janino-3.0.11-kje3HSMpGA5ZIQ6aqhAO4xNFTvCuWIYIx1yxkxlZG-E.jar
    Sep 07, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/de.flapdoodle.embed/de.flapdoodle.embed.mongo/2.2.0/781d14f4e3d9eeb0b4c3e00a4ec165a04b3dc5c4/de.flapdoodle.embed.mongo-2.2.0.jar to gs://temp-storage-for-perf-tests/loadtests/staging/de.flapdoodle.embed.mongo-2.2.0-vNy3lJC0jW9u4Cy1AHsqSbjRUqOTX9ycpEmHkht7vvk.jar
    Sep 07, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.apache.beam/beam-vendor-calcite-1_20_0/0.1/6d16a59dc771784789116607a04acd9a0ef91d16/beam-vendor-calcite-1_20_0-0.1.jar to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-calcite-1_20_0-0.1-1NrX_9FNKiEqNk5qBOaRlj-IwqOvKvQIGIbTVgm_v8Y.jar
    Sep 07, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.codehaus.janino/commons-compiler/3.0.11/f2a6ec7fbc929c9fc87ff8bb486c0574951c5b09/commons-compiler-3.0.11.jar to gs://temp-storage-for-perf-tests/loadtests/staging/commons-compiler-3.0.11-DxpPXyZccBoxkzJErnBF_O8YtPpZUEF-Je5wvlDd2s8.jar
    Sep 07, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.apache.commons/commons-csv/1.8/37ca9a9aa2d4be2599e55506a6d3170dd7a3df4/commons-csv-1.8.jar to gs://temp-storage-for-perf-tests/loadtests/staging/commons-csv-1.8-qL1WZS7UZo2dWjOZSuUvWbnjnI6w68tmhOaK7udXmmE.jar
    Sep 07, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.mongodb/mongo-java-driver/3.9.1/d313237180bf9f2f82e12f503d9617e6b070f792/mongo-java-driver-3.9.1.jar to gs://temp-storage-for-perf-tests/loadtests/staging/mongo-java-driver-3.9.1-mxKxkvmYluxV-Hdn57uyt-MjjSQUsFjxFw9tjhx0bm4.jar
    Sep 07, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.antlr/antlr4-runtime/4.7/30b13b7efc55b7feea667691509cf59902375001/antlr4-runtime-4.7.jar to gs://temp-storage-for-perf-tests/loadtests/staging/antlr4-runtime-4.7-KmGUP4A7vR0OAt_9GbkqQY-DNAyZQ0aAnjtR4iMapsA.jar
    Sep 07, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.antlr/antlr4/4.7/cd6df46532bccabd8127c18c9ca5ef481962e931/antlr4-4.7.jar to gs://temp-storage-for-perf-tests/loadtests/staging/antlr4-4.7-eGclcCizNzrwEd7nts6bWHqP1cegsl9osv9MuQvoqgc.jar
    Sep 07, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/net.java.dev.jna/jna/4.0.0/9b3a11c613ec3fd3440af4103b12c3de82d38b6e/jna-4.0.0.jar to gs://temp-storage-for-perf-tests/loadtests/staging/jna-4.0.0-2sJwtkQc4k2TqW3bbo-T2N8JkZJzh5mm9vz8KyQWyhk.jar
    Sep 07, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/com.ibm.icu/icu4j/58.2/db9fd4b4c189cf1518db14c67d14a2cfcfbe59f6/icu4j-58.2.jar to gs://temp-storage-for-perf-tests/loadtests/staging/icu4j-58.2-lT4eg7K-fD6i-I2obBNhT0fp5x01eMhSHX8Yd1a2OWI.jar
    Sep 07, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.abego.treelayout/org.abego.treelayout.core/1.0.3/457216e8e6578099ae63667bb1e4439235892028/org.abego.treelayout.core-1.0.3.jar to gs://temp-storage-for-perf-tests/loadtests/staging/org.abego.treelayout.core-1.0.3--l4xOVw5wufUasoPgfcgYJMWB7L6Qb02A46yy2-5MyY.jar
    Sep 07, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.antlr/antlr-runtime/3.5.2/cd9cd41361c155f3af0f653009dcecb08d8b4afd/antlr-runtime-3.5.2.jar to gs://temp-storage-for-perf-tests/loadtests/staging/antlr-runtime-3.5.2-zj_I7LEPOemjzdy7LONQ0nLZzT0LHhjm_nPDuTichzQ.jar
    Sep 07, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.antlr/ST4/4.0.8/a1c55e974f8a94d78e2348fa6ff63f4fa1fae64/ST4-4.0.8.jar to gs://temp-storage-for-perf-tests/loadtests/staging/ST4-4.0.8-WMqrxAyfdLC1mT_YaOD2SlDAdZCU5qJRqq-tmO38ejs.jar
    Sep 07, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/net.java.dev.jna/jna-platform/4.0.0/deb6bf66918989b50209b8c9aaf3b2561af7f011/jna-platform-4.0.0.jar to gs://temp-storage-for-perf-tests/loadtests/staging/jna-platform-4.0.0-B21i7Yfna9yzdQ_-gKpJXuIRK9chJmOVTWVH9IJEkuk.jar
    Sep 07, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.glassfish/javax.json/1.0.4/3178f73569fd7a1e5ffc464e680f7a8cc784b85a/javax.json-1.0.4.jar to gs://temp-storage-for-perf-tests/loadtests/staging/javax.json-1.0.4-Dh3sQKHt6WWUElHtqWiu7gUsxPUDeLwxbMSOgVm9vrQ.jar
    Sep 07, 2020 12:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 164 files cached, 46 files newly uploaded in 1 seconds
    Sep 07, 2020 12:45:21 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 07, 2020 12:45:21 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 07, 2020 12:45:21 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 07, 2020 12:45:21 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 07, 2020 12:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 07, 2020 12:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92060 bytes, hash 9caa22170f3c774e20812f238d48cda6c3364bc77391413b70007954f6ac0793> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-nKoiFw88d04ggS8jjUjNpsM2S8dzkUE7cAB5VPasB5M.pb
    Sep 07, 2020 12:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 07, 2020 12:45:23 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-06_17_45_21-6184934019227665815?project=apache-beam-testing
    Sep 07, 2020 12:45:23 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-06_17_45_21-6184934019227665815
    Sep 07, 2020 12:45:23 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-06_17_45_21-6184934019227665815
    Sep 07, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-07T00:45:21.933Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 07, 2020 12:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-07T00:45:30.703Z: Worker configuration: n1-standard-1 in us-central1-a.
    Sep 07, 2020 12:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-07T00:45:31.973Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 07, 2020 12:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-07T00:45:32.027Z: Expanding GroupByKey operations into optimizable parts.
    Sep 07, 2020 12:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-07T00:45:32.064Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 07, 2020 12:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-07T00:45:32.150Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 07, 2020 12:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-07T00:45:32.203Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 07, 2020 12:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-07T00:45:32.244Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 07, 2020 12:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-07T00:45:32.274Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 07, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-07T00:45:32.777Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 07, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-07T00:45:32.862Z: Starting 5 workers in us-central1-a...
    Sep 07, 2020 12:46:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-07T00:46:03.623Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 07, 2020 12:46:07 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-07T00:46:06.933Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 07, 2020 12:46:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-07T00:46:35.667Z: Workers have started successfully.
    Sep 07, 2020 12:46:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-07T00:46:35.693Z: Workers have started successfully.
    Sep 07, 2020 12:47:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-07T00:47:15.583Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 07, 2020 12:47:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-07T00:47:15.838Z: Cleaning up.
    Sep 07, 2020 12:47:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-07T00:47:15.938Z: Stopping worker pool...
    Sep 07, 2020 12:48:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-07T00:48:16.288Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 07, 2020 12:48:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-07T00:48:16.329Z: Worker pool stopped.
    Sep 07, 2020 12:48:24 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-06_17_45_21-6184934019227665815 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): d7b993e8-3a32-4e96-acf0-e46d2c913893 and timestamp: 2020-09-07T00:48:24.518000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     19.69

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 07, 2020 12:48:24 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.032 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 3 mins 16.809 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 8s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/skutaglzcprbk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #963

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/963/display/redirect>

Changes:


------------------------------------------
[...truncated 293.64 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 06, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 06, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 06, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 06, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 06, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 06, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 06, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
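
The IllegalStateException above is Beam's generic failure for a PCollection<Row> that has no schema attached, so no coder can be inferred for it. Below is a minimal, self-contained sketch of the two remedies the message itself names (attach a schema with setRowSchema, or set an explicit RowCoder); the schema and field names are illustrative, not the test's actual ones.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline pipeline = Pipeline.create();

        // Illustrative schema; the real HACKER_NEWS table has more fields.
        Schema schema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();

        Row row = Row.withSchema(schema).addValues("someone", "story", "a title", 3L).build();

        // Attaching the schema lets Beam infer a SchemaCoder for the Rows.
        PCollection<Row> rows = pipeline.apply(Create.of(row).withRowSchema(schema));

        // For a PCollection<Row> produced elsewhere without a schema, either of these
        // avoids the "Unable to return a default Coder" failure seen above:
        //   rows.setRowSchema(schema);
        //   rows.setCoder(RowCoder.of(schema));

        pipeline.run().waitUntilFinish();
      }
    }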

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 06, 2020 6:45:21 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 06, 2020 6:45:21 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 06, 2020 6:45:21 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 06, 2020 6:45:21 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 06, 2020 6:45:21 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 06, 2020 6:45:21 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 06, 2020 6:45:21 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 06, 2020 6:45:21 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 06, 2020 6:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 06, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 06, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-Ird512r_NPmXVBgs477O-wqLrTF1Iv8kXVpwaOiBh4I.jar
    Sep 06, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-9vY8Ah-azMOBnmUYybmuE_htGq_beRbodX1gEVem9v8.jar
    Sep 06, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-c_GFpI4ys5acfvMa-oeV84Xko3pBY9u1mMD1YImY05c.jar
    Sep 06, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-cXrt-QbNpdUpDTb4oVmCnUrpd1rz3uL-0apDncjBAjw.jar
    Sep 06, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-hAL_3XLoO4O3TPfux6HxwaIMmcBi0L9h8oBz481A0iA.jar
    Sep 06, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-QlcMlgrNeCFC9Y52WC5VCRdOH5CtHq7oCfcR64qDTzg.jar
    Sep 06, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-twpuJ82nDarAUgmnaas6dpp7uxkzlOaBTKFWBIDqoTI.jar
    Sep 06, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-xhZgCwH8OXViFr2bS_qM_ueO2pZf4BAwoaCTRXVIWF4.jar
    Sep 06, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-SC1P0YTtYkRGAzC1Zg_345R9dLiJDj_wtih8biKkbAw.jar
    Sep 06, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-3XKtFXWNqekBk8lFzwSVjerohU1BxmuZL9DSEk4ZgfM.jar
    Sep 06, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-6OxZABxk4Mz1K4huE8py6xwkaFh1VedKPSFSeUv1YaY.jar
    Sep 06, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-0Kj7DbSQ7iiqbYvAVDbU5Mj04VLdJO7AWdZZeMGw0ZU.jar
    Sep 06, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-FfEXEm8RNYm2qXUz_tN_0ZZr5f-qfi3LPhUNxH_hlIw.jar
    Sep 06, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-9w_iEtUM3JT6rXTQm4qU0biU2XcfoH3-1FuEp4-9iDs.jar
    Sep 06, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-LRQBi-saTVCIfSuWIYAVTHt3necmSWOmsURtIoA2B1A.jar
    Sep 06, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-z3rU0N3_NpJhiIJBuHG6rE1GmMYAS8ebD3tpRX6iICQ.jar
    Sep 06, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-_EDt6pPcHhCr58lyAXkPpg_M_rRhGf2hy5KZGk2w8yk.jar
    Sep 06, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-9DrXEyDMHzBKhkB9ty-QUDy9aXCl7bqBjwfRqWmRj4I.jar
    Sep 06, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-2ofjfSFl3xhhVvG_1iJBIYFsgw841YW9ly3hM0WFrtI.jar
    Sep 06, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-H8FuICFCZYbyYrGrlty1D3QPOxwH-thVlo4JU4itD-8.jar
    Sep 06, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-Ird512r_NPmXVBgs477O-wqLrTF1Iv8kXVpwaOiBh4I.jar
    Sep 06, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test6926712855192242829.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-FSBUis9QdMUNObHtzs9GDlDrq7NVZd-XeqWvI61ACKk.jar
    Sep 06, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-QFF3g0EIpqgcGmUTUEQX9t7ps0gmNm3AIYQLFujRU4Q.jar
    Sep 06, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-diIZjE7kq_-a4ismh8vNsGmgOn7SiMurr3kRE4BgBTE.jar
    Sep 06, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-IwUGMjlZWDL-8kQ6uDuz2R7FImHqlhM5WvEk74ft3Kg.jar
    Sep 06, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-8O93BBLHR20MGsBMYKQcsNfJf_M9WmHvFuUhytVLnrk.jar
    Sep 06, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-GexAC_xXNupd7ekAE-CE97VZka_aZEuhzaC8Hd-c_SQ.jar
    Sep 06, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-8Bz6sdHKi5RXhn0fpOudONjkl353BRgClgibb3-mKOo.jar
    Sep 06, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-SHPObDJwaXeKMu1kNcUtqW_D0YvJP8CkCImYqeM7xPY.jar
    Sep 06, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-iuWlcUIC0bYbnWEV9zQ1nDiUUrosr6GY7pmqHo_4Kzo.jar
    Sep 06, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-GJ91gzkYdLoLyH7fUfOLOs93x5bgnT9YJzG6BbfPQVo.jar
    Sep 06, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 1 seconds
    Sep 06, 2020 6:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 06, 2020 6:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 06, 2020 6:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 06, 2020 6:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 06, 2020 6:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 06, 2020 6:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92060 bytes, hash d3156f33889707311c141f45cf79c03ba39648518144a4618fb5efe6a8a123f4> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-0xVvM4iXBzEcFB9Fz3nAO6OWSFGBRKRhj7Xv5qihI_Q.pb
    Sep 06, 2020 6:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 06, 2020 6:45:27 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-06_11_45_26-9547296841772676820?project=apache-beam-testing
    Sep 06, 2020 6:45:27 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-06_11_45_26-9547296841772676820
    Sep 06, 2020 6:45:27 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-06_11_45_26-9547296841772676820
    Sep 06, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-06T18:45:26.662Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 06, 2020 6:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-06T18:45:34.359Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 06, 2020 6:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-06T18:45:35.233Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 06, 2020 6:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-06T18:45:35.263Z: Expanding GroupByKey operations into optimizable parts.
    Sep 06, 2020 6:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-06T18:45:35.304Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 06, 2020 6:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-06T18:45:35.391Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 06, 2020 6:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-06T18:45:35.422Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 06, 2020 6:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-06T18:45:35.456Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 06, 2020 6:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-06T18:45:35.489Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 06, 2020 6:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-06T18:45:36.037Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 06, 2020 6:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-06T18:45:36.110Z: Starting 5 workers in us-central1-b...
    Sep 06, 2020 6:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-06T18:45:48.685Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 06, 2020 6:46:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-06T18:46:06.398Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 06, 2020 6:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-06T18:46:24.974Z: Workers have started successfully.
    Sep 06, 2020 6:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-06T18:46:25.004Z: Workers have started successfully.
    Sep 06, 2020 6:46:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-06T18:46:58.332Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 06, 2020 6:46:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-06T18:46:58.499Z: Cleaning up.
    Sep 06, 2020 6:46:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-06T18:46:58.568Z: Stopping worker pool...
    Sep 06, 2020 6:47:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-06T18:47:51.374Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 06, 2020 6:47:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-06T18:47:51.417Z: Worker pool stopped.
    Sep 06, 2020 6:47:58 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-06_11_45_26-9547296841772676820 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 1db10017-5056-4473-96ca-c286b32621b5 and timestamp: 2020-09-06T18:47:58.858000000Z:
                     Metric:                    Value:
                   read_time                    17.294
                 fields_read                 4375276.0
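
For reference, the BEAMPlan above is what a successful push-down looks like: the projection (usedFields) and the filter both move into BeamPushDownIOSourceRel, and the run reports only read_time and fields_read. The sketch below shows roughly how such a query is wired against the BigQuery table provider, using the classes that appear in the stack traces (BeamSqlEnv, BigQueryTableProvider, BeamSqlRelUtils); the DDL text, column list, project and dataset names are illustrative assumptions, not the test's actual definitions.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv;
    import org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils;
    import org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTableProvider;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class PushDownSketch {
      public static void main(String[] args) throws Exception {
        Pipeline pipeline = Pipeline.create();

        // In-memory SQL environment backed by the BigQuery table provider.
        BeamSqlEnv sqlEnv = BeamSqlEnv.inMemory(new BigQueryTableProvider());

        // Illustrative table registration. The "method": "DIRECT_READ" property selects the
        // BigQuery Storage API read path, which is what enables the push-down shown above.
        sqlEnv.executeDdl(
            "CREATE EXTERNAL TABLE HACKER_NEWS (`by` VARCHAR, `type` VARCHAR, `title` VARCHAR, `score` INTEGER) "
                + "TYPE bigquery "
                + "LOCATION 'some-project:some_dataset.hacker_news' "
                + "TBLPROPERTIES '{\"method\": \"DIRECT_READ\"}'");

        // Same shape as the query logged above; the planner folds the WHERE clause and the
        // four projected columns into the BigQuery read.
        PCollection<Row> rows =
            BeamSqlRelUtils.toPCollection(
                pipeline,
                sqlEnv.parseQuery(
                    "SELECT `by` AS author, `type`, `title`, `score` FROM HACKER_NEWS "
                        + "WHERE (`type` = 'story' OR `type` = 'job') AND `score` > 2"));

        pipeline.run().waitUntilFinish();
      }
    }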

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 06, 2020 6:47:59 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.038 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 5,5,main]) completed. Took 2 mins 48.465 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 42s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/et4lf6ouuz2a4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #962

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/962/display/redirect>

Changes:


------------------------------------------
[...truncated 293.66 KB...]
    INFO: BigQuery method is set to: DEFAULT
    Sep 06, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 06, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 06, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 06, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 06, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 06, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 06, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 06, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 06, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 06, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 06, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 06, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 06, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 06, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 06, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 06, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 06, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test639434466901278635.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-tpY3vPIcNSyU4QFWLSAx9Qw_74OlrbA6GiDnTJa6nLQ.jar
    Sep 06, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-zcaypBozoS_2iUsQW7wOsjj_pP-fD171g1gzrsbcAwQ.jar
    Sep 06, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-MYjSOaxjqAD8hurYbxAV7kmkUVRFF65NXG9KCZX5sWU.jar
    Sep 06, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-YhTHkss3k-0nNDLPUsGCFj8PltuMQQwj-d80zGf4E-A.jar
    Sep 06, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT--xcSFK4p9AWkCRevbX2Xel5-HA2tlhPN-pntG9_XuT8.jar
    Sep 06, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-JS3R7kq7jx9efDg8-574LOpXPNU6J3Ax7w1vcKysscs.jar
    Sep 06, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-FblnWm9aTYvCnTnqLFBBkrnYcRP2MreHTbk1R_cRUqM.jar
    Sep 06, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-ijX-NXXVv9YBBIUnJFWbEeufz206p7tyeFGDd_WdPfk.jar
    Sep 06, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-Ms7UIurqvJ-oOZPmHIeWGSRYOnZi58qpNmXzYVs2Jag.jar
    Sep 06, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-nGS32ju8ayNLdUSWIKg6u8WCut5UMH4kM2SlqwNUFUw.jar
    Sep 06, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-jhVLebnydJFilhlWcGZNEf2KJCwLOqM8vGyBebuYeGU.jar
    Sep 06, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-uTfBRkKobNa6M_Q6v7NbrIB_mDoSsMJlN7ZLRjVTjMY.jar
    Sep 06, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-dUxxnbfeF9DpYfao0byeq785tyHQkMYmFyih2OAlPLQ.jar
    Sep 06, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-JdPjEh36ay2dk4EJxF3ou_fahO8O-1pSuToZdQdGph4.jar
    Sep 06, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-1cjbEVNZydo_kGdsdtLCkD19AaPcCO9jFT3OvVQt6vQ.jar
    Sep 06, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-zcaypBozoS_2iUsQW7wOsjj_pP-fD171g1gzrsbcAwQ.jar
    Sep 06, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-bIvuqOTOFauk-j_VEi2O52WrnHtZPEW5kZbh9s2N7ho.jar
    Sep 06, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-a-lJu9Gwar-XdO5huUSBi4TZRCjFgXJW4Bf5scDKkoE.jar
    Sep 06, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-LP_mfu-dYVYNo2fgRgIVG9Cca0J_xoFnigeFPWuBlEM.jar
    Sep 06, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-9M3ezeWwp_UDJ-RdMGbMkxV6bTxV23zRz_AzfUqInmk.jar
    Sep 06, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-VbrHxeywmTJsTc9bi4VIqyhyeq7LP3yVSSNgL7YcKxU.jar
    Sep 06, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-oca7pqbmHNsKaMrDm79ouj1Sn0mCy8IdFwCUFvMg6EY.jar
    Sep 06, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-ST4z2C3usGN63RpnMRe8-y9Ac3A3jYNw5mTK76dD6F0.jar
    Sep 06, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-S7eFDkw-OuTVKLdYnPrz37huZ9_58zWorHP6HewClHI.jar
    Sep 06, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-tsS5et2ap0yvbHLU80_Db15SawyX978oezB2eei2ZeM.jar
    Sep 06, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-ZV41rtnkshlpLEh495ND8_OXr20Sr_KZXyhS-yHzv38.jar
    Sep 06, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-LmPqciHJU1-O7XwFa0PvZZgb0Egu3-npbjzKWugAiQo.jar
    Sep 06, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-aSj_QvQuHZTnnE4dWKxh--BC6UAYHHOxhBuAUF5XRQg.jar
    Sep 06, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-UR6z8gXIu100EnmJhXBCgIrind99yxsFrcb_zyJQmb0.jar
    Sep 06, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-OgZ_V0YTpvPbOb8cgxvHAnUG0m8FGqyccr5bCsAtAmI.jar
    Sep 06, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-46JFy4Hq_vKhJBnmhdJLEd52oNdRoff4mNDmgzuiBF4.jar
    Sep 06, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 0 seconds
    Sep 06, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 06, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 06, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 06, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 06, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 06, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92060 bytes, hash 338e322fa578b85ecbdf5891acbee23321567bb4331aa507526f1315fde8c952> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-M44yL6V4uF7L31iRrL7iMyFWe7QzGqUHUm8TFf3oyVI.pb
    Sep 06, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 06, 2020 12:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-06_05_45_22-14648270370074897105?project=apache-beam-testing
    Sep 06, 2020 12:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-06_05_45_22-14648270370074897105
    Sep 06, 2020 12:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-06_05_45_22-14648270370074897105
    Sep 06, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-06T12:45:22.356Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 06, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-06T12:45:29.827Z: Worker configuration: n1-standard-1 in us-central1-a.
    Sep 06, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-06T12:45:30.621Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 06, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-06T12:45:30.664Z: Expanding GroupByKey operations into optimizable parts.
    Sep 06, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-06T12:45:30.692Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 06, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-06T12:45:30.768Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 06, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-06T12:45:30.794Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 06, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-06T12:45:30.827Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 06, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-06T12:45:30.861Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 06, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-06T12:45:31.222Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 06, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-06T12:45:31.297Z: Starting 5 workers in us-central1-a...
    Sep 06, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-06T12:45:41.435Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 06, 2020 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-06T12:46:03.536Z: Autoscaling: Raised the number of workers to 3 based on the rate of progress in the currently running stage(s).
    Sep 06, 2020 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-06T12:46:03.568Z: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
    Sep 06, 2020 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-06T12:46:09.050Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Sep 06, 2020 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-06T12:46:09.084Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Sep 06, 2020 12:46:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-06T12:46:35.714Z: Workers have started successfully.
    Sep 06, 2020 12:46:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-06T12:46:35.739Z: Workers have started successfully.
    Sep 06, 2020 12:46:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-06T12:46:36.175Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 06, 2020 12:47:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-06T12:47:11.233Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 06, 2020 12:47:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-06T12:47:11.374Z: Cleaning up.
    Sep 06, 2020 12:47:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-06T12:47:11.447Z: Stopping worker pool...
    Sep 06, 2020 12:48:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-06T12:48:05.995Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 06, 2020 12:48:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-06T12:48:06.041Z: Worker pool stopped.
    Sep 06, 2020 12:48:13 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-06_05_45_22-14648270370074897105 finished with status DONE.
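
The WARNING near the start of this run about the requested max number of workers being ignored is expected when a job is submitted with autoscaling disabled: with autoscalingAlgorithm=NONE the worker pool is fixed at numWorkers and maxNumWorkers has no effect. A small sketch of the relevant Dataflow options, with placeholder values mirroring this job's log:

    import org.apache.beam.runners.dataflow.DataflowRunner;
    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions.AutoscalingAlgorithmType;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class WorkerPoolOptionsSketch {
      public static void main(String[] args) {
        DataflowPipelineOptions options =
            PipelineOptionsFactory.fromArgs(args).as(DataflowPipelineOptions.class);
        options.setRunner(DataflowRunner.class);
        options.setProject("apache-beam-testing");                       // placeholder
        options.setRegion("us-central1");                                // placeholder
        options.setTempLocation("gs://temp-storage-for-perf-tests/tmp"); // placeholder

        // Fixed-size pool: the job starts and stays at numWorkers, so any maxNumWorkers
        // setting is ignored, which is exactly what the WARNING in the log reports.
        options.setAutoscalingAlgorithm(AutoscalingAlgorithmType.NONE);
        options.setNumWorkers(5);

        Pipeline pipeline = Pipeline.create(options);
        // ... build and run the pipeline with these options ...
      }
    }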

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): cec20d5b-ab89-4f85-bead-3105c847b22f and timestamp: 2020-09-06T12:48:13.444000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    12.687

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 06, 2020 12:48:13 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
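
The warning above only means the test harness was started without InfluxDB connection properties, so the fields_read / read_time metrics printed to STANDARD_OUT are not persisted anywhere. Below is a hedged sketch of supplying those properties through the settings object in Beam's test utilities; the builder method names and every value are assumptions made for illustration and may not match the exact properties the Jenkins job is expected to pass.

    import org.apache.beam.sdk.testutils.publishing.InfluxDBSettings;

    public class InfluxSettingsSketch {
      public static void main(String[] args) {
        // Assumed builder API; host, database and measurement values are placeholders.
        InfluxDBSettings settings =
            InfluxDBSettings.builder()
                .withHost("http://localhost:8086")
                .withDatabase("beam_test_metrics")
                .withMeasurement("sql_bqio_read_java_batch")
                .get();

        // A publisher configured with these settings would export the metrics instead of
        // logging "Metrics won't be published."
        System.out.println(settings);
      }
    }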

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.018 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 3 mins 4.24 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 56s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/fhp6eb2pxbea6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #961

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/961/display/redirect>

Changes:


------------------------------------------
[...truncated 293.03 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 06, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 06, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 06, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 06, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 06, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 06, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 06, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
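
The failure above is a coder problem: a ParDo that outputs Beam Row elements without an attached schema, so coder inference falls through all three paths the message lists. For context, here is a minimal, self-contained sketch (not the IT's actual RowMonitor code) of the fix the error message itself suggests, PCollection.setRowSchema. The field names mirror the query above; the field types and sample values are assumptions made purely for illustration.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        // Hypothetical schema matching the projected columns of the query above.
        Schema schema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt32Field("score")
                .build();

        PCollection<Row> rows =
            p.apply(Create.of("seed"))
                .apply(
                    ParDo.of(
                        new DoFn<String, Row>() {
                          @ProcessElement
                          public void processElement(
                              @Element String ignored, OutputReceiver<Row> out) {
                            out.output(
                                Row.withSchema(schema)
                                    .addValues("someone", "story", "example title", 3)
                                    .build());
                          }
                        }))
                // Without this call the pipeline fails at construction time with the same
                // "Unable to return a default Coder ... use PCollection.setRowSchema" error.
                .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }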

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 06, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 06, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 06, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 06, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 06, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 06, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 06, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 06, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
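
The plan and filter above show what push-down buys the DIRECT_READ test: only the four used fields are read and the WHERE clause is evaluated on the BigQuery side. Roughly, this corresponds to configuring BigQueryIO's Storage Read API path with selected fields and a row restriction by hand. The sketch below is an illustrative equivalent, not the code the SQL layer generates, and the table reference is a placeholder.

    import java.util.Arrays;
    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;
    import org.apache.beam.sdk.values.PCollection;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        PCollection<TableRow> rows =
            p.apply(
                BigQueryIO.readTableRows()
                    // Placeholder table; the IT reads a Hacker News table in the test project.
                    .from("my-project:my_dataset.hacker_news")
                    // Same read path as the DIRECT_READ table property.
                    .withMethod(TypedRead.Method.DIRECT_READ)
                    // Project only the fields the query uses ...
                    .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                    // ... and push the filter down to the Storage Read API.
                    .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }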
    Sep 06, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 06, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 06, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-DfzPcw7Dvze5GJnZaTQwZ7Kh2jr59UPc3_RpHCsl53k.jar
    Sep 06, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-wOR6Q4_uiyxk_YCZqSG2PmNIEtHKeFDxpw0g1goth78.jar
    Sep 06, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test59729296074576836.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-0DT581HqHMbNRACBuAR2jRYajWbHGOSEIKp5QA7fYyU.jar
    Sep 06, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-mHvCTjO7VTjwJ8ipEAsINe-AC0zzae_B9GGbCfPbOJY.jar
    Sep 06, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-P-6qxRkCzXdOEnB6gfjLBuxszYkqmc9smPBSabYAiMs.jar
    Sep 06, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-Yd3mJ7RWXIRDL3z1nc2A0P7DE3gbs77rLeyPT3uEYqk.jar
    Sep 06, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-OXwVp77UMuo4tqSO4WLlbfLkeVbPcsN9VTtR-RFjMxI.jar
    Sep 06, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-32AVATot_akq2WcFBwOAYb7OABzaFxjS0YTWuFSXMDg.jar
    Sep 06, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-k2li3V7ntCzTf-szfYWEMXMGR5vGaYu5v1sfwYfj2nk.jar
    Sep 06, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-5-ltzEIT7P6nocOpuGaHwnWY5DreRDzsAE2dJVyGnlY.jar
    Sep 06, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-qoiNimAp27TdiL8179RxS-iOg4-9H4Hd4Ynv8RyHpqs.jar
    Sep 06, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-E5EYF5cO8FMm0P7OPMBIQ0OEbUpJued-u3jXekKA3G0.jar
    Sep 06, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-3wDagIdmuJVN48_XuGAjZcIvj_s74ORla8p66Ewo_dM.jar
    Sep 06, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-RDauE__48blFWaHPaIiFvYKFq1KNqhakAOwSIgjmiKU.jar
    Sep 06, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-VDQ-VEHJzziTSmEtVJospKh_WxO7cGWBzFYJGDK77UE.jar
    Sep 06, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tvNzg-Wply6BOQmCmePvWhnZOiNqbImnBA-aAJycYW8.jar
    Sep 06, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-51_DBdGcv59tLQ-y-bzFcXui7QFq-XvLQ3-9Z29I-aM.jar
    Sep 06, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-g5oGJ3iJ614MfPojzabWCHMm0RyFxP9q-9kaoQdMc_8.jar
    Sep 06, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-TDyWBMHxgZvQFUrs_IlhfdJ8ZAIKVn0vmV5Xc5yWNf8.jar
    Sep 06, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-zy1ILTMeHBENsgOM5ffa05BAapkBJfaF0Vq1dDqNgt8.jar
    Sep 06, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-viQvfuBTEPjnura4yrah5fLDEqTx3kwcTNiWNXQOVaM.jar
    Sep 06, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-spTOKGl2scTCVsutOKJsHecfyqb5EamVdYYeuWyHB98.jar
    Sep 06, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-gWqbXPNWoS1jH89_B1xXAnmwq8WkXnClm_5Uebp6mTY.jar
    Sep 06, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-2dMSObru7o0vs1ndNR4Dv5p-sBny8EMss3_mA1Xxa4w.jar
    Sep 06, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-Y9ghTjdWJZpEhtWvnJ4-qiUiNnJ8E2g5UYRgnEDWpYc.jar
    Sep 06, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-DfzPcw7Dvze5GJnZaTQwZ7Kh2jr59UPc3_RpHCsl53k.jar
    Sep 06, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-_5kVgvYERyjY1hZFK1Rh8XYcN8mIyh0-_CIf_VWkTf0.jar
    Sep 06, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-3_uJpAy4ecsZHwkUWB894SxZ3c_T6VJ1-TbDCVWkaU0.jar
    Sep 06, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-RXViuHBbWlXdfoLUoh8ZykdjR_GfrHFinZhBhg8M9Wg.jar
    Sep 06, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-VUetdEiyHDrRuj1a6YsRhlUgbO9LqSjZzNTwWSgoSpg.jar
    Sep 06, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-KrMbFyzji0vF7O_NwzL6QBPNbTdqANJvxLrNFBlcde8.jar
    Sep 06, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 0 seconds
    Sep 06, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 06, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 06, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 06, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 06, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 06, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92058 bytes, hash a00e180eb8390a6fa5943c88eb2463b12c50f8ec587a21d14a0a3a82ba658a90> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-oA4YDrg5Cm-llDyI6yRjsSxQ-OxYeiHRSgo6grplipA.pb
    Sep 06, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 06, 2020 6:45:22 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-05_23_45_21-8482739232909992081?project=apache-beam-testing
    Sep 06, 2020 6:45:22 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-05_23_45_21-8482739232909992081
    Sep 06, 2020 6:45:22 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-05_23_45_21-8482739232909992081
    Sep 06, 2020 6:45:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-06T06:45:21.058Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 06, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-06T06:45:28.092Z: Worker configuration: n1-standard-1 in us-central1-a.
    Sep 06, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-06T06:45:28.745Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 06, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-06T06:45:28.785Z: Expanding GroupByKey operations into optimizable parts.
    Sep 06, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-06T06:45:28.817Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 06, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-06T06:45:28.876Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 06, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-06T06:45:28.906Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 06, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-06T06:45:28.938Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 06, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-06T06:45:28.962Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 06, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-06T06:45:29.301Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 06, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-06T06:45:29.363Z: Starting 5 workers in us-central1-a...
    Sep 06, 2020 6:46:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-06T06:45:59.889Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 06, 2020 6:46:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-06T06:46:00.799Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 06, 2020 6:46:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-06T06:46:30.568Z: Workers have started successfully.
    Sep 06, 2020 6:46:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-06T06:46:30.593Z: Workers have started successfully.
    Sep 06, 2020 6:47:07 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-06T06:47:06.455Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 06, 2020 6:47:07 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-06T06:47:06.612Z: Cleaning up.
    Sep 06, 2020 6:47:07 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-06T06:47:06.699Z: Stopping worker pool...
    Sep 06, 2020 6:47:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-06T06:47:59.024Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 06, 2020 6:48:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-06T06:47:59.063Z: Worker pool stopped.
    Sep 06, 2020 6:48:07 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-05_23_45_21-8482739232909992081 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 6ec3c910-78c4-445b-8a33-3c1c69a400df and timestamp: 2020-09-06T06:48:07.723000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    15.271

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 06, 2020 6:48:08 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 3 mins 0.542 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 49s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/x5ghc7ktooewe

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #960

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/960/display/redirect>

Changes:


------------------------------------------
[...truncated 294.71 KB...]
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 06, 2020 12:45:18 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 06, 2020 12:45:18 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 06, 2020 12:45:18 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 06, 2020 12:45:18 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 06, 2020 12:45:18 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 06, 2020 12:45:18 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 06, 2020 12:45:19 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 06, 2020 12:45:19 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 06, 2020 12:45:19 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 06, 2020 12:45:19 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 06, 2020 12:45:19 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 06, 2020 12:45:19 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 06, 2020 12:45:19 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 06, 2020 12:45:19 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 06, 2020 12:45:19 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 06, 2020 12:45:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 06, 2020 12:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 06, 2020 12:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-bmN-A6e4DOSm-qj-g9PEviCPToxrA9BGwUl0OmWt744.jar
    Sep 06, 2020 12:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-A6jH4Ynh0QK2OeMiBcpIAj2KRKs6EklYVm-E3LrWG_w.jar
    Sep 06, 2020 12:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-uJgLXvjWDw6F1tAhRKACgOTFCPIfdDx-HEGvNfq9bDU.jar
    Sep 06, 2020 12:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-y2KiuyKv_RnVK0lrk2pZBsc_nASfHtoNxnuvqDdWg1s.jar
    Sep 06, 2020 12:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-xZDeSJYgcbojDPx8eUDkAYhaCQa7ISJvjpEzpbOq9nE.jar
    Sep 06, 2020 12:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2872663332240809749.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-RmBI5Z2U4LHsfLzaBfOGZ6RON4V89VVUxvjrqadWgY8.jar
    Sep 06, 2020 12:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-7g-SIIb_jrBq_bdNCOophFBrbEu07KBDKOellsBA7hY.jar
    Sep 06, 2020 12:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-amxVcJI1wlNlj7n_BG30nekEgKpbYfx9RIqZJNtu3PQ.jar
    Sep 06, 2020 12:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-NrC01Mqsv2MWmUdqdEj6VMRhwvQRLN5fxlw8rcDLzkg.jar
    Sep 06, 2020 12:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-JA-KoYemvULsArAENwDw3D5LJEnKhUMO7OGL73aDFjM.jar
    Sep 06, 2020 12:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-sP4DDA7OOD_N024jF_Jis-MX1Tyu2TX-EKWcIIMPGOo.jar
    Sep 06, 2020 12:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-9N0hxFNcjXOWovCtJFhYJrOV4D0QbFSvnUN8FtB4ud4.jar
    Sep 06, 2020 12:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-yVc8YWkQzF1kfuPOC01v3JlS-sWVIoaXcyY-XkPAZpg.jar
    Sep 06, 2020 12:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-l_84y3XIxvOCqBm6_LI94g3xKyUwLdyj9j4RUdwJPSM.jar
    Sep 06, 2020 12:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-ZPhaGjo5hBW5zsrmA6G9DndaP0meyubKy0i6MZlEmdo.jar
    Sep 06, 2020 12:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-GXoQUreTAfgUqhIHlw1xABTAfSYsSpVla_P3TiN20Q8.jar
    Sep 06, 2020 12:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-wf0RWcbKoyX6OQwgZF5br7zgxD1m2A4demb7hVkEILw.jar
    Sep 06, 2020 12:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-LgzFZ9aTBE_yvx3wiASNceZ0WPQD9-Jn7pQxGs7BU18.jar
    Sep 06, 2020 12:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-G4vKtaAzRgd35IEQC6a5uiOmEODO7A5ClMqVt5oLXJc.jar
    Sep 06, 2020 12:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-8xHa0m-VnboZWQYeq8F6YJpO2mz2HcjjQlkzsEnWcTc.jar
    Sep 06, 2020 12:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-JhthiBTl9Wx68jEmH0KgK-Sj4LlDy24ZvD-M0PmSqf0.jar
    Sep 06, 2020 12:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-4y_JV2Cgd1HJuVfk1f2SAMHnh05OCtpGocRXadF2T-0.jar
    Sep 06, 2020 12:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-bmN-A6e4DOSm-qj-g9PEviCPToxrA9BGwUl0OmWt744.jar
    Sep 06, 2020 12:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-NsXl1qdC18Qe3jppGLypQp_djWh1csqzG-sH84DMeO4.jar
    Sep 06, 2020 12:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-wpVseNASJxiSNzNWKaH3S3zGu0UeBHB7dUFcdC4Rj1s.jar
    Sep 06, 2020 12:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-s85ATB45yC9XufPeoRBvYuF12GI2YKbfWiJrVy2thjY.jar
    Sep 06, 2020 12:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-Q0EAnT4Oy5dJ5RirmdcRM7RTdT8GBUIPx5BTVVkuLQE.jar
    Sep 06, 2020 12:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-SXvfX9tZMbuJbfgbh7lW-hKGdtO6h2T3fDntuDPxYH4.jar
    Sep 06, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-GMThGpakPQ5AtTXrTYXJxE3oU8xvjEVfkiEel7seCWA.jar
    Sep 06, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-1DXTSGEmlkaTYPnByiN_f4bxUq-Q-kVacJEiRpQhafg.jar
    Sep 06, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-rrWWLYBKHoolGdwn2I4LnPojx96FDQzAF1m3XdmENSw.jar
    Sep 06, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 10 seconds
    Sep 06, 2020 12:45:33 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 06, 2020 12:45:33 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 06, 2020 12:45:33 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 06, 2020 12:45:33 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 06, 2020 12:45:33 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 06, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92060 bytes, hash 37a187bc92eb7c799bc559f5d0392764158a5d238142546610240de66e8830f2> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-N6GHvJLrfHmbxVn10DknZBWKXSOBQlRmECQN5m6IMPI.pb
    Sep 06, 2020 12:45:33 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 06, 2020 12:45:34 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-05_17_45_33-14071785706633266679?project=apache-beam-testing
    Sep 06, 2020 12:45:34 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-05_17_45_33-14071785706633266679
    Sep 06, 2020 12:45:34 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-05_17_45_33-14071785706633266679
    Sep 06, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-06T00:45:33.764Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 06, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-06T00:45:41.190Z: Worker configuration: n1-standard-1 in us-central1-a.
    Sep 06, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-06T00:45:42.107Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 06, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-06T00:45:42.170Z: Expanding GroupByKey operations into optimizable parts.
    Sep 06, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-06T00:45:42.207Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 06, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-06T00:45:42.274Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 06, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-06T00:45:42.309Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 06, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-06T00:45:42.341Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 06, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-06T00:45:42.377Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 06, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-06T00:45:42.785Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 06, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-06T00:45:42.866Z: Starting 5 workers in us-central1-a...
    Sep 06, 2020 12:46:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-06T00:46:09.789Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Sep 06, 2020 12:46:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-06T00:46:09.823Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Sep 06, 2020 12:46:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-06T00:46:13.828Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 06, 2020 12:46:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-06T00:46:15.224Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 06, 2020 12:46:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-06T00:46:37.757Z: Workers have started successfully.
    Sep 06, 2020 12:46:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-06T00:46:37.781Z: Workers have started successfully.
    Sep 06, 2020 12:47:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-06T00:47:12.298Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 06, 2020 12:47:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-06T00:47:12.459Z: Cleaning up.
    Sep 06, 2020 12:47:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-06T00:47:12.541Z: Stopping worker pool...
    Sep 06, 2020 12:48:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-06T00:48:05.207Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 06, 2020 12:48:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-06T00:48:05.253Z: Worker pool stopped.
    Sep 06, 2020 12:48:13 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-05_17_45_33-14071785706633266679 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): d6674a2a-f6fd-426e-a06b-502128dff75c and timestamp: 2020-09-06T00:48:13.671000000Z:
                     Metric:                    Value:
                   read_time                     13.61
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 06, 2020 12:48:14 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
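
This warning comes from the metrics publisher: without a database and measurement configured, the load-test results printed above stay local and are never written to InfluxDB. A rough sketch of how such settings are usually wired up via InfluxDBSettings; the builder method names, host, database, and measurement below are assumptions inferred from the warning, not verified against this build's sources:

    import org.apache.beam.sdk.testutils.publishing.InfluxDBSettings;

    // Assumed publisher configuration; host, database and measurement names
    // here are placeholders, and the builder methods are an assumption about
    // the test-utils API rather than a verified signature.
    InfluxDBSettings settings =
        InfluxDBSettings.builder()
            .withHost("http://influxdb:8086")
            .withDatabase("beam_test_metrics")
            .withMeasurement("sql_bqio_read_java_batch")
            .get();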

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.022 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.034 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 3 mins 4.011 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 55s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/umkkkg5hhhbbi

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #959

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/959/display/redirect>

Changes:


------------------------------------------
[...truncated 294.72 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 05, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 05, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 05, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 05, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 05, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 05, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 05, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
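
The failure text itself names the two remedies: give the ParDo(RowMonitor) output an explicit row schema, or set a coder on it. A minimal sketch of both, assuming an illustrative schema and DoFn (the names `input`, `RowMonitorFn`, and the field set below are placeholders, not the actual code of BigQueryIOPushDownIT):

    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    // Illustrative schema for the rows the monitor DoFn emits.
    Schema schema =
        Schema.builder()
            .addStringField("author")
            .addStringField("type")
            .addStringField("title")
            .addInt64Field("score")
            .build();

    // Option 1: attach the schema, as the error message suggests.
    PCollection<Row> rows =
        input
            .apply("RowMonitor", ParDo.of(new RowMonitorFn()))  // hypothetical DoFn
            .setRowSchema(schema);

    // Option 2 (alternative to option 1): set an explicit coder instead.
    rows.setCoder(RowCoder.of(schema));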

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 05, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 05, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 05, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 05, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 05, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 05, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 05, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 05, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
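
The plan and the pushed-down filter above correspond, at the IO level, to a Storage Read API scan that projects only the used fields and applies the predicate as a row restriction. A hand-written equivalent using plain BigQueryIO is sketched below; the `pipeline` variable and the table reference are illustrative, and this is not the integration test's own code:

    import java.util.Arrays;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;

    // Read via the Storage Read API, projecting the used fields and pushing
    // the WHERE clause down as a row restriction.
    pipeline.apply(
        "Read HACKER_NEWS with push-down",
        BigQueryIO.readTableRows()
            .from("bigquery-public-data:hacker_news.full")  // illustrative table
            .withMethod(TypedRead.Method.DIRECT_READ)
            .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
            .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));
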
    Sep 05, 2020 6:45:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 05, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 05, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-wXOo1HmbK1RWECSRCRRxVZCaT_8UyDuc51Cb6iNsRtw.jar
    Sep 05, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-d5_ep3D_6NEEEvAAjhPOxBG8BXVABkIpPE986PfCzzA.jar
    Sep 05, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-4TbhZ-p4Akz_d99wEmZVbns2z8bbGYfOZ2zQKe9ipXo.jar
    Sep 05, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-_k9QqrzJErU5wNDaYx_Gf7Sv6JEtbFsjNUMIDtMhZCw.jar
    Sep 05, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-bXZ8A_TOICY8p4OHYg3s1potU2OnCldq6W_GI36TS60.jar
    Sep 05, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-wXOo1HmbK1RWECSRCRRxVZCaT_8UyDuc51Cb6iNsRtw.jar
    Sep 05, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-IqE2nVuH7ZZb8qESNv1TF4iCeAEqJiMZkrvzRsoToT8.jar
    Sep 05, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-nneCKmv5QTtpZrNmJNZdvE12uk6gm0VhWHG3remVheM.jar
    Sep 05, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-rRhixo-jfm7d6qAZ6CkrPzE2UDnf_UspW2aDGQUe5yk.jar
    Sep 05, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-T9rlcqxdWHOGeFeawNikYuF43KLo_IRPYTf4iWKYHaE.jar
    Sep 05, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-bsaFX1jMDEqxNTdw3nLLcx_4Y52bKDH_2iI7h3euyAU.jar
    Sep 05, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-199mOzcKAXPDF2hTlSycyJ2gQruoXtdxucd_qPEKJuI.jar
    Sep 05, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-RmPNdYBcn1rZswPaTZX5hatrZUvUTq2WDPZGKm9gdMQ.jar
    Sep 05, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-zEY5iSUDcEnBShe_thx8757WoeLeG3eoC1kK9XamrBQ.jar
    Sep 05, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-OQ2ufBDVz8FlMfnd6-E7hwDzPoT3oC0RLxwvgjNxQyU.jar
    Sep 05, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-vKmBERJwrGhWTXYCDjIzIQiiN7b4T7E0a4JFbPaSy1I.jar
    Sep 05, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-E0wB9-jsk5hSYq3bLwKYp--K5130Agrd5eDWlthWUKI.jar
    Sep 05, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-B_klsnIG-TVMrgGKUGOoMcwqgIrd7PMaA3wOjQy97OU.jar
    Sep 05, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-rYWIZlUOBgp0QS0KZZj81pa0t3txNA2WtEa2c9SjURk.jar
    Sep 05, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-6cax_iEUCmXnY-dVaLuANCLJpI7IbFw99JjQEwb2Wew.jar
    Sep 05, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-5MEN7AxkveP5FmtCeMqXJ1XFC3eWPvJM2oZIfHTwscg.jar
    Sep 05, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-cRvyp-bMxF0KgzEoRJZ3-tU1qhzjhFS9slGVGbM0TOU.jar
    Sep 05, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2704965924328978874.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-vyzzWmb0G7sWkUVRkL7OdvqSvnWUxpSZMZPzgtEyxVk.jar
    Sep 05, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-fUt_zwPgkHzsExbWPoar3LUe4AnKXpcEokJjv2TYS_Y.jar
    Sep 05, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-3tAOtBLMTu3yBX4tKX-i-0lxo0GFAUMlMdsJTEXwevI.jar
    Sep 05, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-uHVYHfWdlaHTFJhdzxCO2-EYv6s4dGYLVwNUz5csNCI.jar
    Sep 05, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-C4yXtYFSkC92GhvJsx-ximlBPYOHzYrXaCXnRch1XaM.jar
    Sep 05, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-sEjxC45AN-2VshsIIa-lGL1NnQRxe-MopQJzjtannXg.jar
    Sep 05, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-FndGylu539DFO82uEosm6LVsA-_l6TSdlX30tVa_zaw.jar
    Sep 05, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-T94ZQnBzDWyBxkUfy-KOx49KxZfqdbFEGSz5e3jPKQM.jar
    Sep 05, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-NAgHMx9j4h6pWis-ONTgnebcwpHypc5VuiXHo0c-BP8.jar
    Sep 05, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 0 seconds
    Sep 05, 2020 6:45:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 05, 2020 6:45:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 05, 2020 6:45:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 05, 2020 6:45:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 05, 2020 6:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 05, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92060 bytes, hash a64da9351836fe2fc045b1245851a5ccd08bbd4e117287187d08c7a989ebe61d> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-pk2pNRg2_i_ARbEkWFGlzNCLvU4RcocYfQjHqYnr5h0.pb
    Sep 05, 2020 6:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 05, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-05_11_45_19-7322304045136004885?project=apache-beam-testing
    Sep 05, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-05_11_45_19-7322304045136004885
    Sep 05, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-05_11_45_19-7322304045136004885
    Sep 05, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-05T18:45:19.253Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 05, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-05T18:45:27.303Z: Worker configuration: n1-standard-1 in us-central1-a.
    Sep 05, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-05T18:45:28.392Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 05, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-05T18:45:28.433Z: Expanding GroupByKey operations into optimizable parts.
    Sep 05, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-05T18:45:28.469Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 05, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-05T18:45:28.546Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 05, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-05T18:45:28.563Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 05, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-05T18:45:28.593Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 05, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-05T18:45:28.623Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 05, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-05T18:45:28.998Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 05, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-05T18:45:29.076Z: Starting 5 workers in us-central1-a...
    Sep 05, 2020 6:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-05T18:45:36.254Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 05, 2020 6:45:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-05T18:45:57.098Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 05, 2020 6:46:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-05T18:46:20.870Z: Workers have started successfully.
    Sep 05, 2020 6:46:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-05T18:46:20.919Z: Workers have started successfully.
    Sep 05, 2020 6:47:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-05T18:47:00.017Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 05, 2020 6:47:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-05T18:47:00.155Z: Cleaning up.
    Sep 05, 2020 6:47:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-05T18:47:00.244Z: Stopping worker pool...
    Sep 05, 2020 6:48:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-05T18:48:06.929Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 05, 2020 6:48:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-05T18:48:06.972Z: Worker pool stopped.
    Sep 05, 2020 6:48:14 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-05_11_45_19-7322304045136004885 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 66caf2f5-5026-4317-a8b6-3747c0a9f70f and timestamp: 2020-09-05T18:48:14.198000000Z:
                     Metric:                    Value:
                   read_time                    16.986
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 05, 2020 6:48:14 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.031 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 3 mins 8.424 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 59s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/vhobzkpqnrenc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #958

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/958/display/redirect>

Changes:


------------------------------------------
[...truncated 294.82 KB...]
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 05, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 05, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 05, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 05, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 05, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 05, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 05, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 05, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 05, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 05, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 05, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 05, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 05, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 05, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 05, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 05, 2020 12:45:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 05, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 05, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-7vQh-f_yOIRsv-8Kr7EAifwZy23nijJ2d57OQ7Ong2I.jar
    Sep 05, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-T_CcWCnt2il7rPIQcWJH9xB6LZ3-6vg3HNaqm5WWhzM.jar
    Sep 05, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-P0z5I-C7zBiCUgL65lmmVLBr9JGAHWb1H8MjZXPjKd0.jar
    Sep 05, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-9lytZcqdSUaQgm9QSybCQE-WRF_rvmF_x6EpfG-R8uw.jar
    Sep 05, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-O5vtxNM8ohcXrhDwZjxBI1XxwT9mcmrL6S_SBKNI77g.jar
    Sep 05, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-7vQh-f_yOIRsv-8Kr7EAifwZy23nijJ2d57OQ7Ong2I.jar
    Sep 05, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-GjQFzrrEplrLcZRXYX96zfXinvEu_NAugLciVSNXkS0.jar
    Sep 05, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-2E3VFgeaVMSy93XPT7REwc2lv1SEO0fOzqMrsXl6AYE.jar
    Sep 05, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-w9MFs460y4-Lz1C1jmgVmocjDlGDk0VWTNcoeiuat2w.jar
    Sep 05, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-EBsXqerjOj49a99Yhr2_cxE_ViPqlR8hXNmMV3A4Lm8.jar
    Sep 05, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-BPK4Wq7dyR0n5uaojIOQsC2o1mJLJPu9VOX_8yeXmjc.jar
    Sep 05, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-BFKTQoFYbM9kZJW0MC0pTRDMEvcHHJsKd2GTu_NU-kY.jar
    Sep 05, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-2sa_dXS6KDdlCXa6sGMNOPZNGkNtm68xV4MK2R_AdSA.jar
    Sep 05, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-jyne-53n3KgSpRljPKSYNljCn9vDNV3HcIQyTC1N06E.jar
    Sep 05, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-GdDd8BKzweoOGr8FVGvg1J7JJkXuHQTcwvsZlU-f_PQ.jar
    Sep 05, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-Ck--wwyAia-cuMuxKYP7fLj9hEsReReku5q-NtQSX0c.jar
    Sep 05, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-FY69yUU-0ioedkLbDVo_UamNOulP4JwR-BtJgC5Wg30.jar
    Sep 05, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-NBVxLGn74Koo6TMCSwFrOu7QvlLIhXiV_bmFjm37aZI.jar
    Sep 05, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-3SiTV3uOEnO8T8Pf77OE_0mxpTv2yLGjNLOKsBqhYPg.jar
    Sep 05, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3501862738524055203.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-JkyZmAyGQ_bd6pC5Lv69P3FiYqVAzb7oAMn4NhWqDN8.jar
    Sep 05, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-diTWG5zUQRi-2ZiRyv3CkWz2dpIjZdT6rdCd64n6vi4.jar
    Sep 05, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-hDZkSjqS8QZxPcYbgFB1m4yzgakblt3rfTQKp183Ohk.jar
    Sep 05, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-FNMlew5CNtyXD4OqTq4g9XLp6oav0vyOBMXWUrT-4dk.jar
    Sep 05, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-G0_NHvBJhEjQrIQeRelbXqwfCh9Yxp1WFTiT7PnesK0.jar
    Sep 05, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-51blAJta48nH38L_GZ1AcQjHlwVyRsVmiAF-W6ifSeI.jar
    Sep 05, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-4V-rQLdDGysmbDRIwIFx9DPVraKpMhD3GXXOhZbTlyU.jar
    Sep 05, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-aNV5i04uqqqMPzt3663XCAIIXy6Y4UuwlsDzWRmS6-w.jar
    Sep 05, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-klwLqvk1zAlN6S1uydOWiZKdA_7zK08mOwcwM_qJv3w.jar
    Sep 05, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-vcnzwpdKRBKre4u-7qggTe3Ic4MTJ558iSrIarlJ9So.jar
    Sep 05, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-7_4tY2xKtzzYXP6xOz151KQcYaTP7pBgeuNxi0qLBkg.jar
    Sep 05, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-kpTJCL1XguzqACF3r0FJDrKKkEJjlMlbXMMRd1WbIls.jar
    Sep 05, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 0 seconds
    Sep 05, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 05, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 05, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 05, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 05, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 05, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92060 bytes, hash 3cab39bb3721ab28dd10680e7823f25ec5348ab58cd4ae7a5d24c4c1d3dccdd6> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-PKs5uzchqyjdEGgOeCPyXsU0irWM1K56XSTEwdPczdY.pb
    Sep 05, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 05, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-05_05_45_18-2779723688874921461?project=apache-beam-testing
    Sep 05, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-05_05_45_18-2779723688874921461
    Sep 05, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-05_05_45_18-2779723688874921461
    Sep 05, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-05T12:45:18.746Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 05, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-05T12:45:25.895Z: Worker configuration: n1-standard-1 in us-central1-a.
    Sep 05, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-05T12:45:26.563Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 05, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-05T12:45:26.606Z: Expanding GroupByKey operations into optimizable parts.
    Sep 05, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-05T12:45:26.654Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 05, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-05T12:45:26.732Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 05, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-05T12:45:26.770Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 05, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-05T12:45:26.807Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 05, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-05T12:45:26.841Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 05, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-05T12:45:27.192Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 05, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-05T12:45:27.271Z: Starting 5 workers in us-central1-a...
    Sep 05, 2020 12:45:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-05T12:45:44.571Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 05, 2020 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-05T12:45:54.432Z: Autoscaling: Raised the number of workers to 2 based on the rate of progress in the currently running stage(s).
    Sep 05, 2020 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-05T12:45:54.468Z: Resized worker pool to 2, though goal was 5.  This could be a quota issue.
    Sep 05, 2020 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-05T12:45:59.746Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 05, 2020 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-05T12:46:23.806Z: Workers have started successfully.
    Sep 05, 2020 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-05T12:46:23.839Z: Workers have started successfully.
    Sep 05, 2020 12:47:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-05T12:47:06.047Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 05, 2020 12:47:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-05T12:47:06.221Z: Cleaning up.
    Sep 05, 2020 12:47:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-05T12:47:06.300Z: Stopping worker pool...
    Sep 05, 2020 12:48:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-05T12:48:07.935Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 05, 2020 12:48:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-05T12:48:07.984Z: Worker pool stopped.
    Sep 05, 2020 12:48:16 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-05_05_45_18-2779723688874921461 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 8863decf-e707-4491-a393-7086b9466643 and timestamp: 2020-09-05T12:48:16.207000000Z:
                     Metric:                    Value:
                   read_time                    20.208
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 05, 2020 12:48:16 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.038 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 3 mins 10.383 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 59s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/ak2dp2b7m3ilc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #957

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/957/display/redirect>

Changes:


------------------------------------------
[...truncated 293.75 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 05, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 05, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 05, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 05, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 05, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 05, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 05, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
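
The IllegalStateException above names its own remedy: a PCollection<Row> has no default
coder, so either a coder must be set explicitly or, preferably, a schema must be attached
with PCollection.setRowSchema. A minimal, self-contained sketch of that remedy follows
(illustrative names and values only; this is not the integration test's own code):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Schema mirroring the columns projected by the query above.
        Schema schema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();

        PCollection<Row> rows =
            p.apply(
                Create.of(
                        Row.withSchema(schema)
                            .addValues("someone", "story", "An example title", 3L)
                            .build())
                    .withRowSchema(schema));

        // A ParDo that emits Row has no inferable coder; attaching the schema to its
        // output is exactly what the error message asks for.
        rows.apply(
                ParDo.of(
                    new DoFn<Row, Row>() {
                      @ProcessElement
                      public void processElement(@Element Row row, OutputReceiver<Row> out) {
                        out.output(row);
                      }
                    }))
            .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }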

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 05, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 05, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 05, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 05, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 05, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 05, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 05, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 05, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 05, 2020 6:45:15 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 05, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 05, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-oHlbAkOSAEOLfn4MLtLg-04lHsAOlaFTY51YEV7cMlo.jar
    Sep 05, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-4AY3-sgZQuxoZ8qVIvCHhkt11UojFykDrB7Ssn-8z54.jar
    Sep 05, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-wQGbgcfiwNe-SkEEoqQ0KnrBRBMgMDMbbFQZVhTzeno.jar
    Sep 05, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-2bsGV3Bn8QCBTpHAW1g2tfEVZBpmN6emtTJODDD9MIY.jar
    Sep 05, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-DP445obKU838Tm9rb1lNmgNQiUofNpJ4hpEV5wO8jZo.jar
    Sep 05, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-oHlbAkOSAEOLfn4MLtLg-04lHsAOlaFTY51YEV7cMlo.jar
    Sep 05, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-_Y-Q5F0yjYEUIRWxd0BDmmQoRx6wmR_9EkOCzf73zKo.jar
    Sep 05, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-njmVKY2NpnU-agDlcgWexQSNU3TdVhtg_KCGSJCLVTI.jar
    Sep 05, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-Gf8sLSQppq4TSXz5EJ_b1zBWWZnVdGgi_ccJBQDisNg.jar
    Sep 05, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-_vIqiq3LHKEEVR7g--pBn3PcF7eMXBOuTvIvTLGyvqI.jar
    Sep 05, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-qgUrZHUpVvf1iBGwzLrYfkMcjhDCwKjoH5QRYA65RUU.jar
    Sep 05, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-S1ki6Wp-n6p9_gJEXs33gi6XlHadIHHKWxE3nlbWj5I.jar
    Sep 05, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test402302737695451412.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-Sni6BxQHQIskOh-x87wZ7e5cXgcMawNinoHSdYX00rw.jar
    Sep 05, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-csJXaUtRsWnqIOKXjTmSJ-I8AIdYrYovGEa5UILfhnQ.jar
    Sep 05, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-HpZYFJyaYXra7KGV1RCpwOfSoQFRGx4y7mtVQxG83ZQ.jar
    Sep 05, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-__5Xt6Iu7RZXpIXcn60-koirs0dMXgS4mjVX_SrMgiw.jar
    Sep 05, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-AANvAHmz_INCeE6jQyNNkyBfrWRwEpfVgXcagVqcS08.jar
    Sep 05, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-zGVpJ35tFft_osV1Sq14uq5Z-TZW3w8nEt0TKC9k1DU.jar
    Sep 05, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-A0o7C0XSduhRirC8PM-F058AZOT-e12EUwXEwI5cDRE.jar
    Sep 05, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-HdcsTZRySAC6UMbUM06h9UB69kDGoBT6aH6PsP2EYMc.jar
    Sep 05, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-hs-X_DzEch3RfPLV5PlKqMKe7LdgF0-QUWp_ByUGZJU.jar
    Sep 05, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-vXcNmwtoCDJj8gaJaC_N-edUcLV6jhbcXSyITewidN4.jar
    Sep 05, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-6GGeqd1aunJEgH_qvrb_4N3gKTWcG_DTt0pfYubccAs.jar
    Sep 05, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-K8pelwqi3QgULGWlkaDK6VkPzmoUYBx3-NuB19Bsh5Y.jar
    Sep 05, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-kiWChYZOj4JQ15-d3DSjoWFoJSP2v9YpfwLA57MBybo.jar
    Sep 05, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-gVhHbRQjuqRl9NjKYAXNFIRBE7SK1BHNI5Mw2UMmuKY.jar
    Sep 05, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-fmugKI4_MfMLuokhKnqdXeQpQ-rkG-An9qEGxxTbIwU.jar
    Sep 05, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-cxKf8-7pfZ_zDpN7CpVEpTHUGdFKMUJ0hUXIzi51F7g.jar
    Sep 05, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-OYNkAsnaIl_wM9xKZAtQKaTVjVDo8N9LRdmdi77Z8CQ.jar
    Sep 05, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-wAfAkJ6MsEa0agIqOtz8VaJUbOvgTW5PT_r5Wg0HDd4.jar
    Sep 05, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-83SpsZ09elDWRQrmj6Cff39XfTPRwGy57FChjxX4Tgs.jar
    Sep 05, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 1 seconds
    Sep 05, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 05, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 05, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 05, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 05, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 05, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92059 bytes, hash c5c0c74cd0daa4a4c0a3401779087fc158dbe34878c3eb80e7cd082f595d4af7> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-xcDHTNDapKTAo0AXeQh_wVjb40h4w-uA580IL1ldSvc.pb
    Sep 05, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 05, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-04_23_45_19-7297811146727605575?project=apache-beam-testing
    Sep 05, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-04_23_45_19-7297811146727605575
    Sep 05, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-04_23_45_19-7297811146727605575
    Sep 05, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-05T06:45:19.036Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 05, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-05T06:45:27.319Z: Worker configuration: n1-standard-1 in us-central1-a.
    Sep 05, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-05T06:45:28.038Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 05, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-05T06:45:28.078Z: Expanding GroupByKey operations into optimizable parts.
    Sep 05, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-05T06:45:28.133Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 05, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-05T06:45:28.201Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 05, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-05T06:45:28.240Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 05, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-05T06:45:28.282Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 05, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-05T06:45:28.322Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 05, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-05T06:45:28.843Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 05, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-05T06:45:28.950Z: Starting 5 workers in us-central1-a...
    Sep 05, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-05T06:45:38.347Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 05, 2020 6:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-05T06:45:57.144Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 05, 2020 6:46:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-05T06:46:23.194Z: Workers have started successfully.
    Sep 05, 2020 6:46:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-05T06:46:23.223Z: Workers have started successfully.
    Sep 05, 2020 6:47:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-05T06:47:00.953Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 05, 2020 6:47:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-05T06:47:01.104Z: Cleaning up.
    Sep 05, 2020 6:47:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-05T06:47:01.205Z: Stopping worker pool...
    Sep 05, 2020 6:47:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-05T06:47:53.995Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 05, 2020 6:47:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-05T06:47:54.043Z: Worker pool stopped.
    Sep 05, 2020 6:48:03 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-04_23_45_19-7297811146727605575 finished with status DONE.
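
The BEAMPlan earlier in this run (BeamPushDownIOSourceRel with usedFields=[by, type, title,
score]) and the "Pushing down the following filter" line show the projection and the filter
being handed to the BigQuery Storage API read itself rather than evaluated in the pipeline.
A rough sketch of what that push-down amounts to at the BigQueryIO level (table reference and
class name are illustrative; the SQL layer constructs a roughly equivalent read internally):

    import java.util.Arrays;
    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.values.PCollection;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // DIRECT_READ uses the BigQuery Storage API, which accepts a field projection and
        // a row restriction, so column pruning and filtering happen at the source instead
        // of in a downstream ParDo.
        PCollection<TableRow> rows =
            p.apply(
                BigQueryIO.readTableRows()
                    .from("bigquery-public-data:hacker_news.full") // hypothetical table reference
                    .withMethod(Method.DIRECT_READ)
                    .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                    .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }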

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): b4b5aca3-6949-4db5-bfc9-f25aeb3690e8 and timestamp: 2020-09-05T06:48:03.547000000Z:
                     Metric:                    Value:
                   read_time                    14.542
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 05, 2020 6:48:03 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.022 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.032 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':',5,main]) completed. Took 2 mins 57.984 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 47s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/dmsr7dekf43wa

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #956

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/956/display/redirect>

Changes:


------------------------------------------
[...truncated 293.69 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 05, 2020 12:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 05, 2020 12:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 05, 2020 12:45:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 05, 2020 12:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 05, 2020 12:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 05, 2020 12:45:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 05, 2020 12:45:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 05, 2020 12:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 05, 2020 12:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 05, 2020 12:45:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 05, 2020 12:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 05, 2020 12:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 05, 2020 12:45:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 05, 2020 12:45:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 05, 2020 12:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 05, 2020 12:45:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 05, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 05, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-xKkKoM-f5tLvFzAanc-9qUjEFDmIbjInnYvx7UePr8k.jar
    Sep 05, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4697629330266660241.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-lgm5ADZWKeA9lB8F7jxu2sHa8cYbun6NpEFPc9uu2P0.jar
    Sep 05, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-kd4XkGdpcp1TsAxjBcBDkgC9KHHgJQG7BXYHW2ZlvQI.jar
    Sep 05, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-SygokpBYsjnIea4I9WUFj7njcbecJ_4Jq6LUXa4LJC8.jar
    Sep 05, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-3izWYG7VMXCiiyDWWDTQpsjloaaRjQguRu68_lC5NI0.jar
    Sep 05, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-S5_LLxoAPuGrFmEHeixKx9PTIL4nLwIJQQ5nbDsTOXQ.jar
    Sep 05, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-u_ddqpFN_Akz6koWtB9N9ke4fD-lIYcceV-DFXWd8iY.jar
    Sep 05, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-DkLHLk3WoVQ1drxUdvdg7Rsq7r1TBKOk7RI-_XIvvo0.jar
    Sep 05, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-SIVydb8YtvTn8guR1zN7MnWGSA1j9dOO_jr5ZUuctuU.jar
    Sep 05, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-98zWxpMl1iC507hWIJr14q9Jzoc8Hj_ZdhWjsZTg8U8.jar
    Sep 05, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-iLvCJ5H27USBuYumnWFeOBcVkYlrZZ-qsT4ZSFFAsTA.jar
    Sep 05, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-MerHZ9qyldokFpaT3ANE0KQL6xgJPkweBAmEoRT3I0E.jar
    Sep 05, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-yrfcmCQBha3PSdyFcLDTJmPnHfWexkAQx5L7TJOJqEg.jar
    Sep 05, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-oXN3konzl1o_GjQQmNjwSTu1NQ60QLescTz84g_nfL8.jar
    Sep 05, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-wOP7nSsRvKrXhpX7iWh8OCabvLlx5aLK0GUst6zvpxw.jar
    Sep 05, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-RFZtLcJFWVjXJ3y1ZAhkvO70V6HCSL9Hs-EmJRzEtP4.jar
    Sep 05, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-9ZgKlszu_LhjZj36QmcFvtWgcx_ThSaijXffYARPWJg.jar
    Sep 05, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-9CSf0RzKi3c6G7KzSWtPtexHJBbwhYTwfRVMNC4vuR4.jar
    Sep 05, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-mucefgp3bPRvl8XbKqWCgNDzt3SiNuzWtxAnKqmQFH4.jar
    Sep 05, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-xKkKoM-f5tLvFzAanc-9qUjEFDmIbjInnYvx7UePr8k.jar
    Sep 05, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-thGoANVIjnjd8V57RnYGS2wy_Mrwya5B9eObojCM1to.jar
    Sep 05, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-p5FGankQCIEtHquJXKbnaW3iKhxZ_MrRiLgA10rgqZI.jar
    Sep 05, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-sTghlbRkwbQzfJ7L3ZwoaYFXQ2gBK3w7DB54OBWuAT4.jar
    Sep 05, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-4cpCOhYi3ClosSGo-T9LFjKGag-ss7a6EY5mKZ_05ek.jar
    Sep 05, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-Zj64IFmwYyn390ofO4ftNuYHIL5IfQ3FOQuTEZdx90I.jar
    Sep 05, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-e9phA88iSeoM2TS_SJ9nHFboi_GLM9DX6zHOGmBP-ac.jar
    Sep 05, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-829QPlfaP0YjQwI8q7iAMzU3lbexvL7-6FQ_oLxFkR0.jar
    Sep 05, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-MyoP4v-FKfaA-nkApfY865ILXNFZx2rqiA1gzGLcCs0.jar
    Sep 05, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-VmrOYxPP6rwe-FgOJAwoY-K5_rbc060Nwc84C5OOp84.jar
    Sep 05, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-Fgk3riCx7yKZ7qMLL6VxZvNtswuFzG6KXtbbsWtQnI0.jar
    Sep 05, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-C2im8hdu-CFLw5UjW_Nh6zhXD7HdXalznkeoucPn6Aw.jar
    Sep 05, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 1 seconds
    Sep 05, 2020 12:45:20 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 05, 2020 12:45:21 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 05, 2020 12:45:21 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 05, 2020 12:45:21 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 05, 2020 12:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 05, 2020 12:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92059 bytes, hash 7ae7f87547a87544da3f5d6fd996d06193df06e74bf1e14b247c3b5ced0ebcfd> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-euf4dUeodUTaP11v2ZbQYZPfBudL8eFLJHw7XO0OvP0.pb
    Sep 05, 2020 12:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 05, 2020 12:45:23 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-04_17_45_21-2456890729197545683?project=apache-beam-testing
    Sep 05, 2020 12:45:23 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-04_17_45_21-2456890729197545683
    Sep 05, 2020 12:45:23 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-04_17_45_21-2456890729197545683
    Sep 05, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-05T00:45:21.423Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 05, 2020 12:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-05T00:45:30.100Z: Worker configuration: n1-standard-1 in us-central1-a.
    Sep 05, 2020 12:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-05T00:45:31.032Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 05, 2020 12:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-05T00:45:31.081Z: Expanding GroupByKey operations into optimizable parts.
    Sep 05, 2020 12:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-05T00:45:31.114Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 05, 2020 12:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-05T00:45:31.175Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 05, 2020 12:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-05T00:45:31.202Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 05, 2020 12:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-05T00:45:31.229Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 05, 2020 12:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-05T00:45:31.259Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 05, 2020 12:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-05T00:45:31.753Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 05, 2020 12:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-05T00:45:31.808Z: Starting 5 workers in us-central1-a...
    Sep 05, 2020 12:46:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-05T00:46:00.183Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 05, 2020 12:46:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-05T00:46:03.756Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 05, 2020 12:46:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-05T00:46:34.991Z: Workers have started successfully.
    Sep 05, 2020 12:46:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-05T00:46:35.020Z: Workers have started successfully.
    Sep 05, 2020 12:47:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-05T00:47:08.321Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 05, 2020 12:47:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-05T00:47:08.518Z: Cleaning up.
    Sep 05, 2020 12:47:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-05T00:47:08.631Z: Stopping worker pool...
    Sep 05, 2020 12:48:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-05T00:48:01.859Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 05, 2020 12:48:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-05T00:48:01.912Z: Worker pool stopped.
    Sep 05, 2020 12:48:10 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-04_17_45_21-2456890729197545683 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): fc02c525-8967-42f6-82cf-b9a111b6d44e and timestamp: 2020-09-05T00:48:10.711000000Z:
                     Metric:                    Value:
                   read_time                     11.04
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 05, 2020 12:48:11 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.024 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.037 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':',5,main]) completed. Took 3 mins 3.228 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 53s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/juvds5nvyond2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #955

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/955/display/redirect>

Changes:


------------------------------------------
[...truncated 294.66 KB...]
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 04, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 04, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 04, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 04, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 04, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 04, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 04, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])
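
For orientation, the plan above is produced from a plain Beam SQL query string. A minimal, self-contained sketch of running the same filter through SqlTransform over an in-memory PCollection of Rows is shown below; it is not the harness used by BigQueryIOPushDownIT, and the schema, sample values, and class name are illustrative assumptions only.

    // Hedged sketch: generic SqlTransform usage, not the test's own code path.
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class HackerNewsSqlSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Illustrative schema; the real HACKER_NEWS table has more columns.
        Schema schema =
            Schema.builder()
                .addStringField("by")
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();

        PCollection<Row> rows =
            p.apply(
                Create.of(
                        Row.withSchema(schema).addValues("alice", "story", "Hello Beam", 5L).build(),
                        Row.withSchema(schema).addValues("bob", "comment", "Re: Hello", 1L).build())
                    .withRowSchema(schema));

        // The input PCollection is registered under the default table name PCOLLECTION.
        PCollection<Row> filtered =
            rows.apply(
                SqlTransform.query(
                    "SELECT `by` AS author, type, title, score FROM PCOLLECTION "
                        + "WHERE (type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }

In the integration test itself the source is the BigQuery HACKER_NEWS table rather than Create, which is what allows the DIRECT_READ path below to push the projection and filter down to the storage read.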


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
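
The three root causes listed above all point to the same remedy: a PCollection of Rows must be given a schema (via setRowSchema) or an explicit coder (via setCoder) before the next transform finalizes it. A minimal sketch of that pattern follows, assuming a hypothetical two-field schema and a trivial pass-through DoFn standing in for the RowMonitor step, whose source is not part of this log.

    // Hedged sketch: the generic setRowSchema remedy named in the error message.
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Hypothetical schema; the failing pipeline derives its schema from the BigQuery table.
        Schema schema = Schema.builder().addStringField("type").addInt64Field("score").build();

        PCollection<Row> rows =
            p.apply(
                Create.of(Row.withSchema(schema).addValues("story", 3L).build())
                    .withRowSchema(schema));

        PCollection<Row> monitored =
            rows.apply(
                    "PassThrough",
                    ParDo.of(
                        new DoFn<Row, Row>() {
                          @ProcessElement
                          public void processElement(@Element Row row, OutputReceiver<Row> out) {
                            out.output(row);
                          }
                        }))
                // Without this call the output coder cannot be inferred and pipeline
                // construction fails with "Cannot provide a coder for a Beam Row".
                .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }

Calling setCoder(RowCoder.of(schema)) on the ParDo output would typically serve the same purpose; setRowSchema is simply the schema-aware variant the error message recommends.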

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 04, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 04, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 04, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 04, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 04, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 04, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 04, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 04, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 04, 2020 6:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 04, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 04, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-_dvqALjUi3OnMtgGiog7uOamizIWDtrOcxS26gtWlCk.jar
    Sep 04, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-ugZc7nbY8watV5CTQ6LDBETQUZ9_x84QFSyF1ft013s.jar
    Sep 04, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-HdkdRPUoS2mwQg8ei0EjCww7l01DEsMKrlSFl2JBceY.jar
    Sep 04, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-IQKZ2UBpeza1hU0Ci1G_OyTdb750T6WTrbkZC3wfuvs.jar
    Sep 04, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-ugZc7nbY8watV5CTQ6LDBETQUZ9_x84QFSyF1ft013s.jar
    Sep 04, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-GBcyUfu7-C_DjkoPbwrNom4EDMYdQYRe2GZ-HMtBCjI.jar
    Sep 04, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-cd8XU3q2gElYSKL9nPqt0-M8AP_RW-C1WZDWuhNsxRU.jar
    Sep 04, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-dEH4PEWf08eQvmtYiqLyJFR8IYD8VZxG_oKNJzvyPIQ.jar
    Sep 04, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-G6RQOrY7TCPPvN0wqiOOAR1hS6dCkxu4HkhPYizzi20.jar
    Sep 04, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-lea5yE0iS1GKkFJUWkfsyLt7IDAqP1S31WJS5BZAnx4.jar
    Sep 04, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-H04WCrx6ZhFrWv_FXBPD87O6kl8AqUiMh6odLMeA-0g.jar
    Sep 04, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-xJUDbELm94YbZdHE_fCcqy2sYW1mcbXM7V3cDiexD9E.jar
    Sep 04, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-FlxZNht0C4PdOCtAfy_jv6o-5peWSzmZJiRHmq-wE-o.jar
    Sep 04, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-WvZZMWk957eK4YYu15kuI-Cy8N7gbliOjJvjVEqIUdQ.jar
    Sep 04, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-KztnQmWDaIn4OACmosJzlS-suEX11bogGDugvVHa_uk.jar
    Sep 04, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-tjAfqrOlx9BIIi11DAZcTIcOKEXxB_3Legjrdq9fJ60.jar
    Sep 04, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-HLR0Ui3CqE5kJHX_ri3aQH-NqL3DMrU3WNwhPrzQ5HM.jar
    Sep 04, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-iUkIvbWMiyHQhLVfm-sgleFYwtYIzM582qCwU8KUjGI.jar
    Sep 04, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-sdL7kyVGcRcCY-rzoNOSloMQ6X95pAWXuy6JwiV6Fhs.jar
    Sep 04, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test526451819451299432.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-A4KiyzRNbSOLwBQgvejJYcxpfkZ1yeGgtAG7Ky2RdrI.jar
    Sep 04, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests--V0u4llVaazaJpneEqF2pIHQgv03uXwFqk48B5oUHho.jar
    Sep 04, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-kZUOaNeirtwQMb6z6iyrvHKYb5niE0GVBxulvXfPgCo.jar
    Sep 04, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-bYoe1oCNl-XUMIyePQ09DvmDueVELHVcX1NZYvpUBaQ.jar
    Sep 04, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-9EqCvobqSODmPe8mbWKwJjgcgrYYfgcXKH2cpzE7mIQ.jar
    Sep 04, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-wXoGNUQO4VCPZJ9QasjSK5Sz934pP69Vc0CC-rx91AU.jar
    Sep 04, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-ECIUD65k1-hZsKfsV-CreLnjM6Axlt7ROrAwwcg8BBM.jar
    Sep 04, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT--DMLSrx9dB7gfc4vWr_djF7tEQ2yzL0UKLSPwyJDZOQ.jar
    Sep 04, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-nEiEkA0nJFxL5_eLAKffZVM9ufR2566ZP0WrZ_lcoQY.jar
    Sep 04, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-tibr_PTMh4-sbEgc60-2ehWjIB4Ab3nxsLMeGcQj0AE.jar
    Sep 04, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-p0UUHal5vtcyJUbSgqyPybvSanXc0cuFTzcsLCV31X4.jar
    Sep 04, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-3TYXAXzOr1vOtbju6Zbdwn1x50eKFvKYiGp4OK-wLaE.jar
    Sep 04, 2020 6:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 2 seconds
    Sep 04, 2020 6:45:23 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 04, 2020 6:45:23 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 04, 2020 6:45:23 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 04, 2020 6:45:23 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 04, 2020 6:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 04, 2020 6:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92059 bytes, hash 4c5a4b59edbc1b78473416a94e5d91cd1c4de2c05d01775493fde57e500d2d3c> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-TFpLWe28G3hHNBapTl2RzRxN4sBdAXdUk_3lflANLTw.pb
    Sep 04, 2020 6:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 04, 2020 6:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-04_11_45_24-8459306588590518728?project=apache-beam-testing
    Sep 04, 2020 6:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-04_11_45_24-8459306588590518728
    Sep 04, 2020 6:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-04_11_45_24-8459306588590518728
    Sep 04, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-04T18:45:24.117Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 04, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T18:45:32.558Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 04, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T18:45:33.214Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 04, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T18:45:33.332Z: Expanding GroupByKey operations into optimizable parts.
    Sep 04, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T18:45:33.369Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 04, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T18:45:33.451Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 04, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T18:45:33.478Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 04, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T18:45:33.528Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 04, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T18:45:33.564Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 04, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T18:45:34.014Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 04, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T18:45:34.090Z: Starting 5 workers in us-central1-b...
    Sep 04, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-04T18:45:55.031Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 04, 2020 6:46:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T18:46:01.166Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
    Sep 04, 2020 6:46:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T18:46:01.222Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
    Sep 04, 2020 6:46:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T18:46:06.568Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 04, 2020 6:46:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T18:46:22.524Z: Workers have started successfully.
    Sep 04, 2020 6:46:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T18:46:22.554Z: Workers have started successfully.
    Sep 04, 2020 6:46:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T18:46:58.246Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 04, 2020 6:46:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T18:46:58.411Z: Cleaning up.
    Sep 04, 2020 6:46:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T18:46:58.519Z: Stopping worker pool...
    Sep 04, 2020 6:47:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T18:47:44.497Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 04, 2020 6:47:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T18:47:44.660Z: Worker pool stopped.
    Sep 04, 2020 6:47:52 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-04_11_45_24-8459306588590518728 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 32e09d40-becd-4aea-ba95-0704c6c4fa85 and timestamp: 2020-09-04T18:47:52.849000000Z:
                     Metric:                    Value:
                   read_time                    16.845
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 04, 2020 6:47:53 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.035 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 2,5,main]) completed. Took 2 mins 43.523 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 36s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/deudiuwfy7nbk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #954

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/954/display/redirect>

Changes:


------------------------------------------
[...truncated 295.10 KB...]
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 04, 2020 12:46:38 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 04, 2020 12:46:38 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 04, 2020 12:46:38 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 04, 2020 12:46:38 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 04, 2020 12:46:38 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 04, 2020 12:46:38 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 04, 2020 12:46:38 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 04, 2020 12:46:39 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 04, 2020 12:46:39 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 04, 2020 12:46:39 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 04, 2020 12:46:39 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 04, 2020 12:46:39 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 04, 2020 12:46:39 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 04, 2020 12:46:39 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 04, 2020 12:46:39 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 04, 2020 12:46:41 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 04, 2020 12:46:46 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 04, 2020 12:46:46 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-NL4dMTJZp5KLcG_wh0feSL2X2YUNGcyDV_BVL-mAgBI.jar
    Sep 04, 2020 12:46:47 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-zrZMfAAgNwKd5TxYbE17MyhSMlBwzxVDR4en04Tc7Bs.jar
    Sep 04, 2020 12:46:48 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-rzrhvK6mMxHr3ynQSXoaFYm70-xN-gjFuQnNfkrmG6E.jar
    Sep 04, 2020 12:46:48 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-s3WjigbwZXBaYFXFnQxXFfWzvheKQFiAG51kjhuFQYQ.jar
    Sep 04, 2020 12:46:48 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-z5AMF2puKEBu6dh1KJ7pjh0AzlU_Lv_s1ZgvFk2Knes.jar
    Sep 04, 2020 12:46:48 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-NL4dMTJZp5KLcG_wh0feSL2X2YUNGcyDV_BVL-mAgBI.jar
    Sep 04, 2020 12:46:48 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test1409532274238865258.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-PcvfZQPtDMMZk-FgzvDhmawTUwT03zzPlgTMBOipOFE.jar
    Sep 04, 2020 12:46:48 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-JcmnLKVzG95MBTxoyYLgLmCq7kmcYXO6ehts83p3-_g.jar
    Sep 04, 2020 12:46:48 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-NAyKaUNqkYMnkGQCLZ_RI1uI7zXXPpbo1s243-AgHFo.jar
    Sep 04, 2020 12:46:48 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-VKEqsqNDxqWie1D6SXXIGLTylvFKO8yaWTo2V7ijMZo.jar
    Sep 04, 2020 12:46:48 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-LEl8mYFIQr2J1mIaj3bfWhw0IMe5rfRAUJBe__T-Fb8.jar
    Sep 04, 2020 12:46:48 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-8k4D2zC416AXK-XS3RAneTKu8UQjBGCcDoeZcv8RAGc.jar
    Sep 04, 2020 12:46:48 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-AszMEoSEjvE4AC9TcMsRBUVtetyoiB_MGp-Z6e_g3dc.jar
    Sep 04, 2020 12:46:48 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-hjLL0CuFqSEbeFPHo12DKm0_K79cCsKjKe017XDVXw0.jar
    Sep 04, 2020 12:46:48 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-JWkF1C2TbL8IwGpnaiO51jN0W2bynDsLEyoMHMiOhLw.jar
    Sep 04, 2020 12:46:48 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-5jyCDJAmyqo-cZpT2JUhOTn_mYkiCtl2y8gQP0sGoIg.jar
    Sep 04, 2020 12:46:48 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-sJAj3-6Dr54MPEjD9_jQ6qlhWTqaM28_2RlfVlrGWQQ.jar
    Sep 04, 2020 12:46:48 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-M0clu90uA4_qNNb0Tl3fBxGqvN7EP_rbPHTljETnv08.jar
    Sep 04, 2020 12:46:48 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-TBBDSnVz4RmxdKHH4pB0dk82T_536Ofi4LyEZpD-tDg.jar
    Sep 04, 2020 12:46:48 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-P7y7VFTKFaPQhlpYLxd4n5ufb_U4RFvatKvP90ugQq8.jar
    Sep 04, 2020 12:46:48 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-H8NUgmFRoTK8_ieLyJZWzOZLw9W1nw3-QrXp_v8TTL4.jar
    Sep 04, 2020 12:46:48 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-QiF3VmywKQ1pTfF5EE_2YQ6L_7RotfhowwuM2R2-tuE.jar
    Sep 04, 2020 12:46:48 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-iipiuTf9HEeav_bhlYhkbDSa-TYVC6xXQLbmTi32m-0.jar
    Sep 04, 2020 12:46:48 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-FB9vlPhZPkFRZSPVggLBiqqIYY9oNt5fyiaeZgYnAyk.jar
    Sep 04, 2020 12:46:49 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-1pgNNJMwe8G3fcQFhbnK_yeWDpS0w287sjf0u7UxQ20.jar
    Sep 04, 2020 12:46:49 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-eb5G22NYjbhAhJGVR7MerxCFSmwu-1fQvPwhQyYU9tg.jar
    Sep 04, 2020 12:46:49 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-kxfrJo0Ob6UEFqUXei9QR5xL-kasQslxXo5m0CwCwUA.jar
    Sep 04, 2020 12:46:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-SBsCshtLwLODNcJMR2olrZY2pygewC7a4mRbcF97H4k.jar
    Sep 04, 2020 12:46:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-CLKTelcTV4mXrJ3enpj7eTO9W3VmWUQ9dZ6rj3ghTIU.jar
    Sep 04, 2020 12:46:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-X7ggO8GE1ejL4yIKTfxcTRHVBpc4exetMJfzbj7oGFY.jar
    Sep 04, 2020 12:46:52 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-LhruWq8iJDrLzeWlCifK3FjEIjAOt2lLkfJxKFBUABs.jar
    Sep 04, 2020 12:46:52 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 6 seconds
    Sep 04, 2020 12:46:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 04, 2020 12:46:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 04, 2020 12:46:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 04, 2020 12:46:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 04, 2020 12:46:53 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 04, 2020 12:46:53 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92060 bytes, hash 97f2b441d7be2f99de6bfaf40a3a7a44012c51fe22a8b57a62e073821db313e3> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-l_K0Qde-L5nea_r0Cjp6RAEsUf4iqLV6YuBzgh2zE-M.pb
    Sep 04, 2020 12:46:53 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 04, 2020 12:46:56 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-04_05_46_54-8425372716159203116?project=apache-beam-testing
    Sep 04, 2020 12:46:56 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-04_05_46_54-8425372716159203116
    Sep 04, 2020 12:46:56 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-04_05_46_54-8425372716159203116
    Sep 04, 2020 12:46:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-04T12:46:54.121Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 04, 2020 12:47:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T12:47:04.027Z: Worker configuration: n1-standard-1 in us-central1-a.
    Sep 04, 2020 12:47:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T12:47:04.970Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 04, 2020 12:47:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T12:47:05.007Z: Expanding GroupByKey operations into optimizable parts.
    Sep 04, 2020 12:47:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T12:47:05.054Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 04, 2020 12:47:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T12:47:05.146Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 04, 2020 12:47:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T12:47:05.180Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 04, 2020 12:47:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T12:47:05.219Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 04, 2020 12:47:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T12:47:05.264Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 04, 2020 12:47:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T12:47:05.833Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 04, 2020 12:47:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T12:47:05.935Z: Starting 5 workers in us-central1-a...
    Sep 04, 2020 12:47:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-04T12:47:11.832Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
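The warning above concerns Cloud Monitoring's metric-descriptor quota rather than the test itself: every unique user-defined metric name creates a descriptor, and the project has hit 100. Below is a minimal sketch of the list/delete clean-up the message points to, written against the google-cloud-monitoring Java client; the project ID is taken from the log but is otherwise arbitrary, and the filter prefix is an assumption.

    import com.google.api.MetricDescriptor;
    import com.google.cloud.monitoring.v3.MetricServiceClient;
    import com.google.monitoring.v3.ListMetricDescriptorsRequest;
    import com.google.monitoring.v3.ProjectName;

    public class MetricDescriptorCleanup {
      public static void main(String[] args) throws Exception {
        // Project ID as seen in this log; substitute your own.
        String projectId = "apache-beam-testing";

        try (MetricServiceClient client = MetricServiceClient.create()) {
          // Assumed filter: restrict the listing to user-defined (custom) descriptors.
          ListMetricDescriptorsRequest request =
              ListMetricDescriptorsRequest.newBuilder()
                  .setName(ProjectName.of(projectId).toString())
                  .setFilter("metric.type = starts_with(\"custom.googleapis.com/\")")
                  .build();

          for (MetricDescriptor descriptor : client.listMetricDescriptors(request).iterateAll()) {
            System.out.println(descriptor.getType());
            // Deleting is irreversible; uncomment only for descriptors known to be unused.
            // client.deleteMetricDescriptor(descriptor.getName());
          }
        }
      }
    }
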
    Sep 04, 2020 12:47:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T12:47:34.657Z: Autoscaling: Raised the number of workers to 3 based on the rate of progress in the currently running stage(s).
    Sep 04, 2020 12:47:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T12:47:34.693Z: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
    Sep 04, 2020 12:47:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T12:47:40.002Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 04, 2020 12:48:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T12:48:08.396Z: Workers have started successfully.
    Sep 04, 2020 12:48:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T12:48:08.443Z: Workers have started successfully.
    Sep 04, 2020 12:48:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T12:48:48.062Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 04, 2020 12:48:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T12:48:48.335Z: Cleaning up.
    Sep 04, 2020 12:48:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T12:48:48.430Z: Stopping worker pool...
    Sep 04, 2020 12:49:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T12:49:47.642Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 04, 2020 12:49:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T12:49:47.702Z: Worker pool stopped.
    Sep 04, 2020 12:49:58 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-04_05_46_54-8425372716159203116 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 71193d7e-5c70-46f1-849e-c9c922e6aaec and timestamp: 2020-09-04T12:49:58.155000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    18.138

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 04, 2020 12:49:58 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.031 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.076 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 3 mins 40.189 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 26s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/fsabd3w4l3nos

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #953

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/953/display/redirect>

Changes:


------------------------------------------
[...truncated 295.45 KB...]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 04, 2020 6:45:33 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 04, 2020 6:45:33 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 04, 2020 6:45:33 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 04, 2020 6:45:33 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 04, 2020 6:45:33 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 04, 2020 6:45:33 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 04, 2020 6:45:33 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
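The exception above is Beam's standard message for a PCollection<Row> that has no schema attached; the remedies it names are PCollection.setRowSchema(...) or setCoder(RowCoder.of(schema)). The following is a minimal, self-contained sketch of that pattern only — not the actual fix inside BeamSqlRelUtils — and the field names and types merely mirror the projected columns of the query and are assumptions.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaExample {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        // Assumed field names/types, loosely matching the projected columns above.
        Schema schema = Schema.builder()
            .addNullableField("author", Schema.FieldType.STRING)
            .addNullableField("type", Schema.FieldType.STRING)
            .addNullableField("title", Schema.FieldType.STRING)
            .addNullableField("score", Schema.FieldType.INT64)
            .build();

        PCollection<Row> rows = p
            .apply(Create.of("story"))
            .apply(ParDo.of(new DoFn<String, Row>() {
              @ProcessElement
              public void processElement(@Element String type, OutputReceiver<Row> out) {
                out.output(Row.withSchema(schema)
                    .addValues("someone", type, "a title", 3L)
                    .build());
              }
            }));

        // Without this line, coder inference fails with the same message as in the log.
        rows.setRowSchema(schema); // equivalent to rows.setCoder(RowCoder.of(schema))

        p.run().waitUntilFinish();
      }
    }
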

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 04, 2020 6:45:33 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 04, 2020 6:45:33 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 04, 2020 6:45:33 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 04, 2020 6:45:33 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 04, 2020 6:45:33 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 04, 2020 6:45:33 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 04, 2020 6:45:33 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 04, 2020 6:45:33 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
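The plan and filter lines above show what the SQL push-down amounts to at the IO layer: only the referenced columns are requested and the predicate is evaluated by the BigQuery Storage API. A rough hand-written equivalent against BigQueryIO is sketched below for illustration only; the table reference is a placeholder, not the table used by this test.

    import com.google.api.services.bigquery.model.TableRow;
    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
    import org.apache.beam.sdk.values.PCollection;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        PCollection<TableRow> rows = p.apply(
            BigQueryIO.readTableRows()
                .from("some-project:some_dataset.hacker_news") // placeholder table reference
                .withMethod(Method.DIRECT_READ)
                // Only the projected columns are requested from the Storage API...
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                // ...and the WHERE clause is applied server-side as a row restriction.
                .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }
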
    Sep 04, 2020 6:45:34 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 04, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 04, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-wEhB8g48rkvYAt74gfEy1vLBUBXPlrcRhiYCllyzj-0.jar
    Sep 04, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-pHwFHVziQF4rLpapB37KPcIHPXOKjFiFBLz6YUxqzXQ.jar
    Sep 04, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-MzgCAqJx_mZTlfP0A_PS3UyQ_oesfvoq92i4lg715JU.jar
    Sep 04, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-xPiXOHHEUKgpNN6OSKNnQR0tkbF3CDv6u1EMZFZgsYc.jar
    Sep 04, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-wNoXNs7dTBt6ePvXDMvrsCVjaL3qPKtTmF7_0Qidgxk.jar
    Sep 04, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-My4wMyRFic7-K5LiLZEqOJrGc2fk251vSBs3CG-z8ik.jar
    Sep 04, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-CjfICdmhbg-V2rXFShtBcJmNkdlvXMiVZjlDKbtKHv8.jar
    Sep 04, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-C0yWbHoqEhrIZtzGeVEniHMCmOOGWGrXyyS19nWi9XI.jar
    Sep 04, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-nUfjKrkIovuh5bk4feuvHuuCxS9Aia31EIx6ErbMsmU.jar
    Sep 04, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-A64j0MSFa9iQduhbgIIj5UmBOxUj_7A6oM2-D0rvXeE.jar
    Sep 04, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-5wvtGGNtxVBZM9d2rsSVmy2_8EKrEhhTgZbT8irksI4.jar
    Sep 04, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-wEhB8g48rkvYAt74gfEy1vLBUBXPlrcRhiYCllyzj-0.jar
    Sep 04, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-_nrK-AKAZdZnsT1cqDbmRBiBqE7TjZA6BQ9jkmiCYMg.jar
    Sep 04, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-7kqII_iC4CC7UTQyf4G-F3PiNrdbp342BN5R0_By6f8.jar
    Sep 04, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-Zd1Q6bE21BNux5ymmnSW-3the_kl7-05nicQtDaWFPg.jar
    Sep 04, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-iDUy85bM3yicONoe4PIYgWK87EsSx8s87uH8pqntOcg.jar
    Sep 04, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3778433319039810591.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-C7dQuswpjsbEIpYaLnj1QB5UZ0hnzWRg-d6imBSKDCM.jar
    Sep 04, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-JfrYaUMz2ZnCdgNZO95bDZ5kZd4G2q3bJYeUkcgwf0U.jar
    Sep 04, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-l4K-uGay3kjq_zPtHX5-RE9PpHYZB5Kj08UM7HX8rNs.jar
    Sep 04, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-M0UiFMxB7wxFf2tEQjUo_c0EIWm9zZWvQJiNUriYd60.jar
    Sep 04, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-gZT0MY2ugJ9OXuzElQNidFqtdF3O6a2_Ltp6S4KO-uU.jar
    Sep 04, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-_RlEpMvsm9rHnATtIM_qHv1N6A7WVCFrWf2mGSoBQdk.jar
    Sep 04, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-31PVZLhXCZfXe_wxpEpV7d_lMrscQLoN98zg3f-NTwY.jar
    Sep 04, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-SYQcP8vNufpiwGa2MDZTOi0t2uLo9AwWdpNQVhKk4tQ.jar
    Sep 04, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-2jGrqC7if3PwEUJxIoKPLgwVbC5OX4k4yKXTOApbdoA.jar
    Sep 04, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-PrW90VyXnitHVqjm-SkWqWZ-NgHw7n5_HGgGu4X0_Bk.jar
    Sep 04, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-lmT19_hknd3uxUXP__K9TSyCPoWYqEZHd5pc4q-gDR8.jar
    Sep 04, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-ZJKUy8yKyv80Tz4JNrGrXmp-0fLMI2mEBZfG-ZGtahc.jar
    Sep 04, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-5ylvfo5aFmQMo0YJp_-LZ65II62luU0fs6okL6Kv3ss.jar
    Sep 04, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-CtbGa7DtlbkZtBCFErJysBS2UyeqNtom0CPG4K1BJNk.jar
    Sep 04, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-X57Oak-5AFvc66SGOYfKeWmEXK7lUpWxn8xiXTGVyE0.jar
    Sep 04, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 1 seconds
    Sep 04, 2020 6:45:37 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 04, 2020 6:45:37 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 04, 2020 6:45:37 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 04, 2020 6:45:37 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 04, 2020 6:45:37 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 04, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92060 bytes, hash 46d1001e971e6272d3d6e90c05025d9fcfac852d85442703d89f44dc7d6436f3> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-RtEAHpceYnLT1ukMBQJdn8-shS2FRCcD2J9E3H1kNvM.pb
    Sep 04, 2020 6:45:38 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 04, 2020 6:45:39 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-03_23_45_38-14037458262807185953?project=apache-beam-testing
    Sep 04, 2020 6:45:39 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-03_23_45_38-14037458262807185953
    Sep 04, 2020 6:45:39 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-03_23_45_38-14037458262807185953
    Sep 04, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-04T06:45:38.375Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 04, 2020 6:45:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T06:45:45.849Z: Worker configuration: n1-standard-1 in us-central1-a.
    Sep 04, 2020 6:45:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T06:45:46.426Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 04, 2020 6:45:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T06:45:46.473Z: Expanding GroupByKey operations into optimizable parts.
    Sep 04, 2020 6:45:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T06:45:46.512Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 04, 2020 6:45:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T06:45:46.593Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 04, 2020 6:45:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T06:45:46.629Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 04, 2020 6:45:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T06:45:46.662Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 04, 2020 6:45:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T06:45:46.702Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 04, 2020 6:45:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T06:45:47.159Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 04, 2020 6:45:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T06:45:47.241Z: Starting 5 workers in us-central1-a...
    Sep 04, 2020 6:46:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-04T06:46:03.383Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 04, 2020 6:46:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T06:46:14.232Z: Autoscaling: Raised the number of workers to 3 based on the rate of progress in the currently running stage(s).
    Sep 04, 2020 6:46:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T06:46:14.262Z: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
    Sep 04, 2020 6:46:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T06:46:19.570Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 04, 2020 6:46:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T06:46:44.992Z: Workers have started successfully.
    Sep 04, 2020 6:46:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T06:46:45.029Z: Workers have started successfully.
    Sep 04, 2020 6:47:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T06:47:20.117Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 04, 2020 6:47:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T06:47:20.273Z: Cleaning up.
    Sep 04, 2020 6:47:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T06:47:20.360Z: Stopping worker pool...
    Sep 04, 2020 6:48:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T06:48:13.636Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 04, 2020 6:48:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T06:48:13.680Z: Worker pool stopped.
    Sep 04, 2020 6:48:22 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-03_23_45_38-14037458262807185953 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 38ae3b8e-cfbd-4d16-a75b-aefd96ac7912 and timestamp: 2020-09-04T06:48:22.523000000Z:
                     Metric:                    Value:
                   read_time                    12.953
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 04, 2020 6:48:22 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.031 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.036 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 8,5,main]) completed. Took 2 mins 57.752 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 5s
106 actionable tasks: 72 executed, 34 from cache

Publishing build scan...
https://gradle.com/s/lz7v35q2m6r7c

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #952

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/952/display/redirect?page=changes>

Changes:

[Kyle Weaver] [BEAM-10849] Test that default Dataflow region option is set.

[noreply] [Beam-4379] Make ParquetIO read splittable (#12223)

[noreply] Identify pair-with-none DoFn in CombineGlobally with a named URN.

[Kyle Weaver] [BEAM-10849] Refactor to fix DataflowRunnerTest mocks.

[noreply] Merge pull request #12761 from [BEAM-10853] Fixing copy name issue:

[noreply] Merge pull request #12618 from [BEAM-10669] Add support for Dataflow

[noreply] [BEAM-10615] Eliminate nullability errors from

[noreply] [BEAM-10720] Add implementation for elementwise str methods (#12749)

[noreply] Badge fixes in README and PR Template (#12769)

[noreply] [BEAM-5757] Add ElasticsearchIO: delete document support (#12670)

[noreply] More logging for missing next work index. (#12718)


------------------------------------------
[...truncated 298.24 KB...]
    Sep 04, 2020 12:45:51 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 04, 2020 12:45:51 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 04, 2020 12:45:51 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 04, 2020 12:45:51 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 04, 2020 12:45:51 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 04, 2020 12:45:51 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 04, 2020 12:45:51 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 04, 2020 12:45:51 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 04, 2020 12:45:51 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 04, 2020 12:45:51 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 04, 2020 12:45:51 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 04, 2020 12:45:51 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 04, 2020 12:45:52 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 04, 2020 12:45:52 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 04, 2020 12:45:52 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 04, 2020 12:45:54 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 04, 2020 12:45:55 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-Q3z8VSLoSVOhhPgDEpdkVGnTPbiXw7SUlPt3CjopIis.jar
    Sep 04, 2020 12:45:55 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-2nUWIKn_Vrkh5gLTSglD3Zytkw4lY9ItQTRMyjRH9gU.jar
    Sep 04, 2020 12:45:55 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-bbBKMgSZg6H5q6c6R-Ui8ETT93Bvw98EBNr_nM_thII.jar
    Sep 04, 2020 12:45:55 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-Ue0K8dY2vq-bS3V2nH45l7S9RWpE89ao5-cTIZWg07Y.jar
    Sep 04, 2020 12:45:55 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-btmk8CCS8UtPLZ95UdNKNVMQ6CwJbFqqgee73tx1olc.jar
    Sep 04, 2020 12:45:55 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-4F_P9tOPYPIGFOguWwgzdrUX8w3aPZBxhub3nSZVkME.jar
    Sep 04, 2020 12:45:55 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-h-q4LSmda7IaC9XQDRpKny94Ikk9f4zPx01nBMZSKb8.jar
    Sep 04, 2020 12:45:55 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-GUARFztG2gfqfW6UQ0QTGWePOmpXZItCNbKaX7l7aR0.jar
    Sep 04, 2020 12:45:55 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-Q3z8VSLoSVOhhPgDEpdkVGnTPbiXw7SUlPt3CjopIis.jar
    Sep 04, 2020 12:45:55 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-y9hBhyFHLx4Q757DQZ95KoXLVs5-IFV8aVkH2dFiXk8.jar
    Sep 04, 2020 12:45:55 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8120536183838451263.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-NkNg5eo78g52AMD3XSFoKgmqFw125_ESUbpv7mxJagU.jar
    Sep 04, 2020 12:45:55 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests--yv52PW1WcyfwiQre90BeJi1Po8cORcUp3TpKNu4c4U.jar
    Sep 04, 2020 12:45:55 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-iUqBPa6GBpamzS78VZ0-BC20-pFB_5ww0RHB6jNzFlg.jar
    Sep 04, 2020 12:45:55 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-Ce6kGjLWnMNTpyqxpTvTZ8imZCSwHGbakev6zUiG2p0.jar
    Sep 04, 2020 12:45:55 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-A60UfNROU0qr-2NxVe-pNCtd6SoPOaV8e8R86lLKiU8.jar
    Sep 04, 2020 12:45:55 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-1_B3wj_KAiB3iFKXBRrs5usuuxSpw0VRdpZCvCul_Zw.jar
    Sep 04, 2020 12:45:55 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-fdJbsSpfJRYJVf9uwyFbabbS-kxIlr_1ZEWr1vvr6ew.jar
    Sep 04, 2020 12:45:55 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-OAvXK-RpAKVoF-EiwAlQTnDG5gmv5Gha3xoD_tqEsko.jar
    Sep 04, 2020 12:45:55 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-I1DwE69wYUsKADxbHKa_IjcRT37nWgdR0CxCplBCdfg.jar
    Sep 04, 2020 12:45:55 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-mhf5Z8ekAQ2SYNv6wkbFWPew19at2eKPL4S4uMCGIqY.jar
    Sep 04, 2020 12:45:55 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-1eI4UasSB6WFt62O5E77R44l-lkFbLWjtyCj6_OqmiM.jar
    Sep 04, 2020 12:45:55 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-qLlRvF5LV23CC58csL-bGWElSEdgHEr9R5K5ue3LGDQ.jar
    Sep 04, 2020 12:45:55 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-Sntz2ibMKeUsSjgy7wIJF6brmVfy7OB-rn65lsMJP_w.jar
    Sep 04, 2020 12:45:55 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-dlq0QVCX5KH7YH4e3E-N_cIfc2jwI6H0gR4SzmRZfV4.jar
    Sep 04, 2020 12:45:55 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-nfgb4BcwagpKqZvbGj298kIUlJyr25YmNifcDcTMzuI.jar
    Sep 04, 2020 12:45:55 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-dXtyBUIsNT42hrwBzo_79uBZw3vkbur4r4SOKDUx3u4.jar
    Sep 04, 2020 12:45:55 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-dWGxLwzepbyuaV0dJWe4CLJq7G7tMxeq3uZOAg6LwzE.jar
    Sep 04, 2020 12:45:55 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-vRKnJ1Kox49hOPV_LDl4_mtEEwfGKO_oaZnSduIbtVc.jar
    Sep 04, 2020 12:45:55 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-tga-lS0n_4ETUtbKylqsdctkjPxcKzQjWNRVK1fynIA.jar
    Sep 04, 2020 12:45:55 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-Fj26YqR0vnJK41-bmuoA-iggtDvTKBdqlv8eoRVqbZE.jar
    Sep 04, 2020 12:45:55 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-uI0Bcjv7wdt0kwhDeNn9JICkadwXr7vl8ljSywxpWzY.jar
    Sep 04, 2020 12:45:56 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 1 seconds
    Sep 04, 2020 12:45:56 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 04, 2020 12:45:56 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 04, 2020 12:45:56 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 04, 2020 12:45:56 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 04, 2020 12:45:56 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 04, 2020 12:45:56 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92060 bytes, hash 6840be72e2e8cec661cd3f06b05929119225148278cb730d41bdd50e8a1c497e> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-aEC-cuLozsZhzT8GsFkpEZIlFIJ4y3MNQb3VDoocSX4.pb
    Sep 04, 2020 12:45:56 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 04, 2020 12:45:58 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-03_17_45_56-4260775302805086488?project=apache-beam-testing
    Sep 04, 2020 12:45:58 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-03_17_45_56-4260775302805086488
    Sep 04, 2020 12:45:58 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-03_17_45_56-4260775302805086488
    Sep 04, 2020 12:45:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-04T00:45:56.658Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 04, 2020 12:46:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T00:46:05.292Z: Worker configuration: n1-standard-1 in us-central1-a.
    Sep 04, 2020 12:46:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T00:46:05.943Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 04, 2020 12:46:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T00:46:06.001Z: Expanding GroupByKey operations into optimizable parts.
    Sep 04, 2020 12:46:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T00:46:06.055Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 04, 2020 12:46:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T00:46:06.147Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 04, 2020 12:46:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T00:46:06.182Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 04, 2020 12:46:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T00:46:06.207Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 04, 2020 12:46:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T00:46:06.242Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 04, 2020 12:46:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T00:46:06.652Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 04, 2020 12:46:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T00:46:06.730Z: Starting 5 workers in us-central1-a...
    Sep 04, 2020 12:46:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-04T00:46:11.316Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 04, 2020 12:46:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T00:46:40.528Z: Autoscaling: Raised the number of workers to 3 based on the rate of progress in the currently running stage(s).
    Sep 04, 2020 12:46:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T00:46:40.561Z: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
    Sep 04, 2020 12:46:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T00:46:45.933Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Sep 04, 2020 12:46:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T00:46:45.970Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Sep 04, 2020 12:46:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T00:46:51.446Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 04, 2020 12:46:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T00:46:58.415Z: Workers have started successfully.
    Sep 04, 2020 12:46:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T00:46:58.451Z: Workers have started successfully.
    Sep 04, 2020 12:47:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T00:47:35.047Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 04, 2020 12:47:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T00:47:35.290Z: Cleaning up.
    Sep 04, 2020 12:47:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T00:47:35.383Z: Stopping worker pool...
    Sep 04, 2020 12:48:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T00:48:30.247Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 04, 2020 12:48:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-04T00:48:30.287Z: Worker pool stopped.
    Sep 04, 2020 12:48:38 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-03_17_45_56-4260775302805086488 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 52411feb-6f59-4911-9663-b60a3d25c787 and timestamp: 2020-09-04T00:48:38.284000000Z:
                     Metric:                    Value:
                   read_time                    14.444
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 04, 2020 12:48:38 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 3 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.033 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 2 mins 56.217 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 20s
106 actionable tasks: 74 executed, 32 from cache

Publishing build scan...
https://gradle.com/s/o6j43i2fqj4x6

Stopped 2 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #951

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/951/display/redirect?page=changes>

Changes:

[Omkar_Deshpande] [BEAM-10829] pass kafka header to producer.send


------------------------------------------
[...truncated 293.98 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 03, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 03, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 03, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 03, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 03, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 03, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 03, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
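
The "Unable to return a default Coder" failure above comes from a PCollection of Beam Row that never gets a schema attached, so coder inference has nothing to work with. A minimal, hypothetical sketch of the remedy the message itself names, attaching a schema via PCollection.setRowSchema (or, equivalently, an explicit RowCoder via setCoder), with illustrative field names rather than the real HACKER_NEWS schema:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.MapElements;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;
    import org.apache.beam.sdk.values.TypeDescriptor;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Illustrative schema; the real HACKER_NEWS table has more fields.
        Schema schema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt32Field("score")
                .build();

        PCollection<Row> rows =
            p.apply(Create.of("story", "job"))
                .apply(
                    MapElements.into(TypeDescriptor.of(Row.class))
                        .via((String t) ->
                            Row.withSchema(schema).addValues("someone", t, "a title", 3).build()))
                // Without an explicit schema (or coder) here, coder inference fails with
                // the same IllegalStateException shown above.
                .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }
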

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 03, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 03, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 03, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 03, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 03, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 03, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 03, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 03, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 03, 2020 6:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 03, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 03, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-WWRY67q0t81OjBYkwFCEm7dujC_ffg5CrGh2OSVZZT0.jar
    Sep 03, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-rzUij8GoEt5P72x7Tdv2pycuzjDWuysHZBs4tJtWlXQ.jar
    Sep 03, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-R9-0q2CyM7F_F1lvPs4mFsh-TgJZ5W7SQv-y54cnl0w.jar
    Sep 03, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-8bLogvErIBoDpe3cU19hqxdPM0LZrAA542gZPnNDrrg.jar
    Sep 03, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-spB0ek1J2xu17s6G4bJjefdYvpr-PUmKeiSd5kzUvEc.jar
    Sep 03, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-yHi4HJSr9b_Q2Y3S8euuy6CV53FJ9IO-_mIn49Kcf9I.jar
    Sep 03, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-62cOtQjCTJTgC5ogSwFk4ViMsJR7u_drf6o2jyoNdFU.jar
    Sep 03, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-lPICyoHJuOh5Ai_vlfcJt87TB2ykMvM3AGCkk44zKwI.jar
    Sep 03, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-Eljvi5YMGFRIYnSqrkKIBCXj7mrgQuA5SSYNZjF1zBs.jar
    Sep 03, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3680889446539450451.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-PpGH9_9RK06ItgSYwc2XGOx76glK6s2Uc_8bauemfnA.jar
    Sep 03, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-UUlWSPg1ldImMJAgo2LOJ2S8Khexf1tr6DVE4zknHl8.jar
    Sep 03, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-1LhRmB7jaXOT6I7hEAXEvLO_OFx_uwi-F3yvwiOhlew.jar
    Sep 03, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-B9Aj-rL5CB0SIPewnadZ4KAB5pJo1y9B7oKGiPUSLdo.jar
    Sep 03, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-Y24xXy3sPUosZNV_-mU7zc_5EiZ-sxbFzCX6WZFsBUE.jar
    Sep 03, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-xmcGPC-lhrc5f0Loki_WfnkWeNdXmT605a9fQe9WIFM.jar
    Sep 03, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-WWRY67q0t81OjBYkwFCEm7dujC_ffg5CrGh2OSVZZT0.jar
    Sep 03, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-ykTE83oRgyZK66AS2c9gI2jpYoB1RTLZuN5I39v7NQY.jar
    Sep 03, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-TtDbxw8lR1W3qbszx89UV61T-AYYwwq0wZT2c0E3-j8.jar
    Sep 03, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-E3qtDLn59og7OFO9QNP9oAGg5o_CoDOxqhZKZAC6h1M.jar
    Sep 03, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-kXqPNaKTvPsXFLp5jY_rbIEkq3cXjPW9gar9x5Qb2Iw.jar
    Sep 03, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-l7kd5_Q8vzMJdiqoXbTBcAvGpFJK_QoTOSOpU0tra8s.jar
    Sep 03, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-UA2GfsI1rDlkdPURghex-s7kpJFsNk1iiR1KOZ-w2UY.jar
    Sep 03, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-DFGBZZPNYex9KvruxiKveO65uFT3CatbAqy-z9wRV_I.jar
    Sep 03, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-mLYXDq2UWAXK2GCUKxzedi-Z19oHmXNL9j02WjRxyf4.jar
    Sep 03, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-ZIA8saye9B8Lfb-dgUZ_nDFGDAYifzcC7B9spGHgpf4.jar
    Sep 03, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-4ggbRk5LTXtQp56hgp9_qX0DofbehssMjSWNX2PXLaU.jar
    Sep 03, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-65v1CeiiQjczw3wEEL302YPaTHnsZ1jaSMctHf4p3Fo.jar
    Sep 03, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-aB7N5qiG5w1V3KC_ySAylgrJCLziuS3Iav3_7qughS4.jar
    Sep 03, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-SXnaf9DHaBdvQqVfdnBMcfaXKP-Uru7BgNDiXKWHVKQ.jar
    Sep 03, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-wgXaJsnJtUDr0AOHlJQRPxpIUVkFHs0ysxpXntZucMQ.jar
    Sep 03, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-iHx3BdKOg5wejeK_K-_nSDzNNvo2FerVatrwF78XHcM.jar
    Sep 03, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 0 seconds
    Sep 03, 2020 6:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 03, 2020 6:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 03, 2020 6:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 03, 2020 6:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 03, 2020 6:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 03, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92061 bytes, hash 0f522ef0417a8176db3f28c7b6d6d184f9bcb5308e81dc3b1e51ef88942a6811> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-D1Iu8EF6gXbbPyjHttbRhPm8tTCOgdw7HlHviJQqaBE.pb
    Sep 03, 2020 6:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 03, 2020 6:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-03_11_45_22-13943276162814593401?project=apache-beam-testing
    Sep 03, 2020 6:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-03_11_45_22-13943276162814593401
    Sep 03, 2020 6:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-03_11_45_22-13943276162814593401
    Sep 03, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-03T18:45:22.446Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 03, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-03T18:45:30.661Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 03, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-03T18:45:31.923Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 03, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-03T18:45:31.966Z: Expanding GroupByKey operations into optimizable parts.
    Sep 03, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-03T18:45:32.005Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 03, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-03T18:45:32.100Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 03, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-03T18:45:32.129Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 03, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-03T18:45:32.159Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 03, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-03T18:45:32.191Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 03, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-03T18:45:32.577Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 03, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-03T18:45:32.657Z: Starting 5 workers in us-central1-b...
    Sep 03, 2020 6:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-03T18:45:54.502Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 03, 2020 6:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-03T18:46:10.168Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 03, 2020 6:46:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-03T18:46:39.302Z: Workers have started successfully.
    Sep 03, 2020 6:46:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-03T18:46:39.335Z: Workers have started successfully.
    Sep 03, 2020 6:47:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-03T18:47:24.619Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 03, 2020 6:47:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-03T18:47:24.748Z: Cleaning up.
    Sep 03, 2020 6:47:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-03T18:47:24.836Z: Stopping worker pool...
    Sep 03, 2020 6:48:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-03T18:48:29.102Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 03, 2020 6:48:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-03T18:48:29.154Z: Worker pool stopped.
    Sep 03, 2020 6:48:36 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-03_11_45_22-13943276162814593401 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 73f9130f-a022-4a36-ad3a-26e321e326e2 and timestamp: 2020-09-03T18:48:37.042000000Z:
                     Metric:                    Value:
                   read_time                     20.64
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 03, 2020 6:48:37 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.035 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 3 mins 30.537 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 21s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/adwwvwxl5paba

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #950

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/950/display/redirect>

Changes:


------------------------------------------
[...truncated 294.02 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 03, 2020 12:45:35 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 03, 2020 12:45:35 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 03, 2020 12:45:35 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 03, 2020 12:45:35 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 03, 2020 12:45:35 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 03, 2020 12:45:35 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 03, 2020 12:45:36 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 03, 2020 12:45:36 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 03, 2020 12:45:36 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 03, 2020 12:45:36 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 03, 2020 12:45:36 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 03, 2020 12:45:36 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 03, 2020 12:45:36 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 03, 2020 12:45:36 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 03, 2020 12:45:37 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
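
For context, the push-down reported just above (only the used fields by, type, title and score are read, and the supported filter is shipped to the BigQuery Storage API) corresponds roughly to the following direct-read configuration. This is an illustrative sketch, not the integration test's code; the project/dataset/table spec is a placeholder:

    import com.google.api.services.bigquery.model.TableRow;
    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.values.PCollection;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Read only the columns the query uses and let BigQuery evaluate the filter
        // server-side, which is what the planner's push-down achieves automatically.
        PCollection<TableRow> rows =
            p.apply(
                "Read HACKER_NEWS with push-down",
                BigQueryIO.readTableRows()
                    .from("my-project:my_dataset.hacker_news") // placeholder table spec
                    .withMethod(TypedRead.Method.DIRECT_READ)
                    .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                    .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }

In the DEFAULT-method plan earlier in this log, by contrast, the same projection and filter stay inside BeamCalcRel and would run on the workers rather than in BigQuery.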
    Sep 03, 2020 12:45:38 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 03, 2020 12:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 03, 2020 12:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-pPf_X7cTKfnJjKn0E8HtqtPJI-eJtLglW_mCaYdvhqQ.jar
    Sep 03, 2020 12:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-9i7fc_RYztpMk3g-803LtXDCpXo0gAtbAhtqnQin7zE.jar
    Sep 03, 2020 12:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tbZWUfuSNtTg4Tobrk_GKp6SqTxBqid9NwzRgpQsQMs.jar
    Sep 03, 2020 12:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-mQlFS5waJ1-ZnekivlJWYHLlIV2EmnB8BsP0Fs7ORhM.jar
    Sep 03, 2020 12:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-dm7aDBRRoifUfQaTK0MM3vyNbxpbkrRcYGZ1lltBqP4.jar
    Sep 03, 2020 12:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-BvsDAgt-p81KArbcA5rSwhmh9nCTgK8oEpi5pNvUdxQ.jar
    Sep 03, 2020 12:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-pw_yAlu6sBTsr2znmMrr3_Yfvwrp5dk1IKp9NehtcGA.jar
    Sep 03, 2020 12:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-S28yFVk__qU7fzDy9POjFMCmOIRcOBdnAB3hSdOMBtE.jar
    Sep 03, 2020 12:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-OlVF1iEm7t64ITNfvnRcooSW4_KvBQOQKSDkoOl7uBY.jar
    Sep 03, 2020 12:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-iiQvSkNdXG0jC4Xx0s7fGPovHpRgdolrjsr-ckWlIWI.jar
    Sep 03, 2020 12:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-odEgg34MrMhCf-RArOOgWtVFnMhNtKl-yVDXurqirss.jar
    Sep 03, 2020 12:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-T4YWU7papG2L7aRYTI5tkNCqEjOGoBMgYcoGPWJa2eE.jar
    Sep 03, 2020 12:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-FHhZl7BwDPJxHFCzRA5gMz93JxqfSAJ4oGm_QLaetnA.jar
    Sep 03, 2020 12:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-hJ7ZL0RZA0qO53MjV-6jma_uZnnhS4_Ja4c-5BgeEgk.jar
    Sep 03, 2020 12:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-RiGzqU93BHq8XRd0u9jTiPSu-Pms4-r6ytahbbfZntE.jar
    Sep 03, 2020 12:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-AFY-elJ9Q6f1y0QQYUqmrDGMHdxp1i16WjSCQY0rdNU.jar
    Sep 03, 2020 12:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-kdXvXuBnruOeh9YhO9W0q7ZV6u_aa-48eAozWX3bbUk.jar
    Sep 03, 2020 12:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5020972479356339541.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-EqdbopbMkW5F30HDGaPQlspRoojDc1g6agmEUQ7Ba0o.jar
    Sep 03, 2020 12:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-pPf_X7cTKfnJjKn0E8HtqtPJI-eJtLglW_mCaYdvhqQ.jar
    Sep 03, 2020 12:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-TdoZrDpJLcrqMyQPY4hwd93SURJBu3OTiJb3K_ZiKrY.jar
    Sep 03, 2020 12:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-xTseBSPrD3xb5kTP7O-gJzuOjikxkLyVBqJghMF3wME.jar
    Sep 03, 2020 12:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-EChVPERr4DB3gPMP7JYmPuvTkzSYDoqg6EeWVggQSvU.jar
    Sep 03, 2020 12:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-Ggt2MylJutwomHzz2BgHNGXxHUxRpBEAD0UKJwDLfeo.jar
    Sep 03, 2020 12:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-9dXXjhVURSql3zxqjQbV1co6NGroEXsXm2P_qxAJyL0.jar
    Sep 03, 2020 12:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-ogpq0FnDO5M65d2lmnHA91LoKWKeF6lNbKHF0WOrfx8.jar
    Sep 03, 2020 12:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-g3JWpppI9shXOef-I0gOUT14IFd5we3juhPnzm_2rgI.jar
    Sep 03, 2020 12:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-eeONId7YKdlOX7tO2JPNHF8kN61UOmhaM_GTKd_kX4k.jar
    Sep 03, 2020 12:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-TA-0E2QaLM5gOR0CxiutPgpPspcvtWl9lUC5n8Fe_50.jar
    Sep 03, 2020 12:45:44 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-u70Ca7aCI7iSZK319UuNhtsMqYxyqYeqJGPQP-oYhTA.jar
    Sep 03, 2020 12:45:44 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-tRbf5Uy1GeuRN1qdU75ayhghlSEl8TuoqjEqTF353wg.jar
    Sep 03, 2020 12:45:44 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-KBxSgaNqjEzXAVQypbDaeEhPQUkzt9PG4NNzWjTWOKQ.jar
    Sep 03, 2020 12:45:49 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 6 seconds
    Sep 03, 2020 12:45:49 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 03, 2020 12:45:49 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 03, 2020 12:45:49 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 03, 2020 12:45:49 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 03, 2020 12:45:50 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 03, 2020 12:45:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92060 bytes, hash 2dc1dbb42ac2e6dccd3249c20c1de1177b1950465dcfc84316bbeb6eef092edf> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-LcHbtCrC5tzNMknCDB3hF3sZUEZdz8hDFrvrbu8JLt8.pb
    Sep 03, 2020 12:45:50 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 03, 2020 12:45:52 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-03_05_45_50-6979677857769739842?project=apache-beam-testing
    Sep 03, 2020 12:45:52 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-03_05_45_50-6979677857769739842
    Sep 03, 2020 12:45:52 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-03_05_45_50-6979677857769739842
    Sep 03, 2020 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-03T12:45:50.535Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 03, 2020 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-03T12:46:02.026Z: Worker configuration: n1-standard-1 in us-central1-f.
    Sep 03, 2020 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-03T12:46:03.443Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 03, 2020 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-03T12:46:03.489Z: Expanding GroupByKey operations into optimizable parts.
    Sep 03, 2020 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-03T12:46:03.522Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 03, 2020 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-03T12:46:03.608Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 03, 2020 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-03T12:46:03.645Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 03, 2020 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-03T12:46:03.680Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 03, 2020 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-03T12:46:03.749Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 03, 2020 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-03T12:46:04.199Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 03, 2020 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-03T12:46:04.287Z: Starting 5 workers in us-central1-f...
    Sep 03, 2020 12:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-03T12:46:12.255Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 03, 2020 12:46:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-03T12:46:32.784Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 03, 2020 12:47:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-03T12:47:12.652Z: Workers have started successfully.
    Sep 03, 2020 12:47:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-03T12:47:12.684Z: Workers have started successfully.
    Sep 03, 2020 12:47:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-03T12:47:51.025Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 03, 2020 12:47:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-03T12:47:51.193Z: Cleaning up.
    Sep 03, 2020 12:47:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-03T12:47:51.323Z: Stopping worker pool...
    Sep 03, 2020 12:48:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-03T12:48:42.702Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 03, 2020 12:48:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-03T12:48:42.750Z: Worker pool stopped.
    Sep 03, 2020 12:48:54 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-03_05_45_50-6979677857769739842 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): fe04b8a9-c14c-455d-8f24-027172f02fb4 and timestamp: 2020-09-03T12:48:54.765000000Z:
                     Metric:                    Value:
                   read_time                    15.959
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 03, 2020 12:48:55 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.031 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.049 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker,5,main]) completed. Took 3 mins 29.204 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 36s
106 actionable tasks: 72 executed, 34 from cache

Publishing build scan...
https://gradle.com/s/y753y3uzq7dak

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #949

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/949/display/redirect?page=changes>

Changes:

[noreply] [BEAM-10567] Add LICENSE for pbr (#12765)


------------------------------------------
[...truncated 294.78 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 03, 2020 6:45:19 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 03, 2020 6:45:19 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 03, 2020 6:45:20 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 03, 2020 6:45:20 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 03, 2020 6:45:20 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 03, 2020 6:45:20 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 03, 2020 6:45:20 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 03, 2020 6:45:20 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 03, 2020 6:45:20 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 03, 2020 6:45:20 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 03, 2020 6:45:20 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 03, 2020 6:45:20 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 03, 2020 6:45:20 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 03, 2020 6:45:20 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 03, 2020 6:45:20 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 03, 2020 6:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 03, 2020 6:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 03, 2020 6:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-n-NY-xOFpGkXbwqVfKOOaCa7VgjGNtTqQpAPAX0qd6Q.jar
    Sep 03, 2020 6:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-O5ODJ9QhNlmDelEDN48SEvLk5TRPJAOJYxcsq-PYPXg.jar
    Sep 03, 2020 6:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test6885044231593748659.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-VqHA_bXZeXnMkUR7jg8ad4GBW-nbjwuVgYrUBQu8uX0.jar
    Sep 03, 2020 6:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-lbuZLCvDCc1pUEaM4PFTkwOsF3yrmXBxDRHcTxjKR2g.jar
    Sep 03, 2020 6:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT--6i5fMVXFhv7zjZdpeOoG85fqPuMrPaGiihBO6w3yp8.jar
    Sep 03, 2020 6:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-tAuMo_WKpTRNR08n4svAvx4haVk0Kv68tzqHY4GZcuk.jar
    Sep 03, 2020 6:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-BNZbxkyCQrsq3X0eHpsE_W45Fss7csG7k6ozOshNtoc.jar
    Sep 03, 2020 6:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-oZslS9yazXar3NOL1stsn_QjwKKNPqWgoOjCewnlMjQ.jar
    Sep 03, 2020 6:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-wmP2U-isa6m9Ugo0asi6OR_GuTUJaEJLwB85ggeKzXI.jar
    Sep 03, 2020 6:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-YMtPSpKhetncvKwZXWW2B6GhgBl3Us_jMeqB1EyBSeM.jar
    Sep 03, 2020 6:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-ndrC6RlDZhhmZ7jngGnSUBKedgEt1d5vCISf3iBP71k.jar
    Sep 03, 2020 6:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-Ba9TVwrcRJ4gewxAbNjO83rxmkiXVYdprxAs7Lri_3M.jar
    Sep 03, 2020 6:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-AAw_2ddRC-gKT4lcUncUnplBxFPfkF_qvp4YH3btka4.jar
    Sep 03, 2020 6:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-KZJQAg-bSKtJ9ZXUHr1D729qbF7pgpgCPfE3meGVtek.jar
    Sep 03, 2020 6:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-hTt3icN0eO4glhFokJk8lLn49Et-ubhb0bwG3tRlOY8.jar
    Sep 03, 2020 6:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-pR81wuw1nz6e4HmNGRuGEh9V3nI0nm-CNbG4caNTzgc.jar
    Sep 03, 2020 6:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-R2ySa3D_2TJwdOH6wf5N9qZ0kk2QjgD_UKvvGPDQoqw.jar
    Sep 03, 2020 6:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-n88eWZ2bA7SCz3F-m-_hjVbjM68S3nKzFljo0ngoz5c.jar
    Sep 03, 2020 6:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-n-NY-xOFpGkXbwqVfKOOaCa7VgjGNtTqQpAPAX0qd6Q.jar
    Sep 03, 2020 6:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-fy7-JbT2_2ATEsZmcMA9hANPAhkDRFVkg0WG3eLRiAQ.jar
    Sep 03, 2020 6:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-Brq4hdHn__gWNCUsq2tOu3ioUR3sbXxK1ea1GiQAl0o.jar
    Sep 03, 2020 6:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-6vrqFSN5xU90oV8APMNTN2jDEcBro1K6V-hxKIniUP8.jar
    Sep 03, 2020 6:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-OIvmfJqaag7A5BPL5FoXkU8sbmQuD8SmidMkzOumQy4.jar
    Sep 03, 2020 6:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-1GuhUT58gZvD9WdWcedfal9ose6621OJdYgdnTgfMG8.jar
    Sep 03, 2020 6:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-PIZ-hR023CDZuq6UB22lPPTIW3Asc95SokxEr9BYDE4.jar
    Sep 03, 2020 6:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-Tpv56UPvUkzcKLa6nnYw32o2IYR6-01-kXqAd3JiuG4.jar
    Sep 03, 2020 6:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-bwy_ejfXRIpSpp709w_Ads1Sw8eCkZaMSDmBGmRx0A0.jar
    Sep 03, 2020 6:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-iHZZoMru7D-ndIcRf9qhuMWWVvWarCaAVqoZGG4MYGc.jar
    Sep 03, 2020 6:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-sLvJhSYS9b4uv2BakxldPbD1va1tfKVLT8ckwO5QlTo.jar
    Sep 03, 2020 6:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-AIiEgbaG_d54A6T9nUG2ajS27ttcUsZtrD7sW2RxTWI.jar
    Sep 03, 2020 6:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-fx1QO_llK2Sc_TpYhA7n7vcKzEKW3_1G_JviL7II9os.jar
    Sep 03, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 1 seconds
    Sep 03, 2020 6:45:25 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 03, 2020 6:45:25 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 03, 2020 6:45:25 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 03, 2020 6:45:25 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 03, 2020 6:45:25 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 03, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92060 bytes, hash d7bc81556863507e04ebf9ca2a386a3ba3bc79c464de01e152367a1617bb8efb> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-17yBVWhjUH4E6_nKKjhqO6O8ecRk3gHhUjZ6Fhe7jvs.pb
    Sep 03, 2020 6:45:25 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 03, 2020 6:45:29 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-02_23_45_25-6528098257804220757?project=apache-beam-testing
    Sep 03, 2020 6:45:29 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-02_23_45_25-6528098257804220757
    Sep 03, 2020 6:45:29 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-02_23_45_25-6528098257804220757
    Sep 03, 2020 6:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-03T06:45:25.805Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 03, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-03T06:45:36.333Z: Worker configuration: n1-standard-1 in us-central1-f.
    Sep 03, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-03T06:45:37.062Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 03, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-03T06:45:37.106Z: Expanding GroupByKey operations into optimizable parts.
    Sep 03, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-03T06:45:37.138Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 03, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-03T06:45:37.212Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 03, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-03T06:45:37.238Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 03, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-03T06:45:37.268Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 03, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-03T06:45:37.306Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 03, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-03T06:45:37.670Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 03, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-03T06:45:37.749Z: Starting 5 workers in us-central1-f...
    Sep 03, 2020 6:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-03T06:45:50.461Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 03, 2020 6:46:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-03T06:46:05.080Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 03, 2020 6:46:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-03T06:46:29.849Z: Workers have started successfully.
    Sep 03, 2020 6:46:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-03T06:46:29.881Z: Workers have started successfully.
    Sep 03, 2020 6:47:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-03T06:47:09.596Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 03, 2020 6:47:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-03T06:47:09.743Z: Cleaning up.
    Sep 03, 2020 6:47:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-03T06:47:09.822Z: Stopping worker pool...
    Sep 03, 2020 6:48:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-03T06:48:00.221Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 03, 2020 6:48:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-03T06:48:00.256Z: Worker pool stopped.
    Sep 03, 2020 6:48:09 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-02_23_45_25-6528098257804220757 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): a56429b5-0c94-490f-8d94-6709ab2c4cdd and timestamp: 2020-09-03T06:48:09.626000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    17.363

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 03, 2020 6:48:10 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.032 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 2,5,main]) completed. Took 2 mins 59.178 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 52s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/bxjr22rw2nzzy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #948

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/948/display/redirect?page=changes>

Changes:

[srohde] Gracefully shutdown the channel reader in the test_stream_impl

[ningk] [BEAM-10545] Added the inspector of PCollections and pipelines

[ningk] Simplified some of the code based on comments

[ningk] Fixed lint issues.

[srohde] Add CANCELLED to non error codes for test stream events from grpc

[Boyuan Zhang] Refactor split logic to reuse common logic.


------------------------------------------
[...truncated 294.76 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 03, 2020 12:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 03, 2020 12:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 03, 2020 12:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 03, 2020 12:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 03, 2020 12:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 03, 2020 12:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 03, 2020 12:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
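
The IllegalStateException above also names the fix: a PCollection of Beam Rows needs an explicit schema (or coder) before the pipeline graph can be finalized. A minimal sketch of the setRowSchema route, with a hypothetical schema and values standing in for the HACKER_NEWS columns (class name, inputs, and field values invented for illustration):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.MapElements;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;
    import org.apache.beam.sdk.values.TypeDescriptors;

    public class RowSchemaSketch {
      // Hypothetical schema mirroring the four columns the failing query projects.
      static final Schema SCHEMA =
          Schema.builder()
              .addStringField("author")
              .addStringField("type")
              .addStringField("title")
              .addInt64Field("score")
              .build();

      static PCollection<Row> rowsWithSchema(Pipeline p) {
        return p.apply(Create.of(3L, 7L))
            .apply(
                MapElements.into(TypeDescriptors.rows())
                    .via((Long score) ->
                        Row.withSchema(SCHEMA)
                            .addValues("someone", "story", "a title", score)
                            .build()))
            // Without this call, finishing the pipeline specification fails with the
            // same "Cannot provide a coder for a Beam Row" IllegalStateException seen above.
            .setRowSchema(SCHEMA);
      }
    }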

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 03, 2020 12:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 03, 2020 12:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 03, 2020 12:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 03, 2020 12:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 03, 2020 12:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 03, 2020 12:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 03, 2020 12:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 03, 2020 12:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
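
The BEAMPlan above folds the projection and the filter into BeamPushDownIOSourceRel, so only four columns and pre-filtered rows are read from BigQuery. At the BigQueryIO level the push-down amounts to roughly the sketch below; the table spec is a placeholder, since the log does not show which BigQuery table backs beam.HACKER_NEWS:

    import java.util.Arrays;

    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
    import org.apache.beam.sdk.values.PCollection;

    public class DirectReadPushDownSketch {
      // Roughly what the planner's push-down means at the IO level: only the selected
      // columns and the rows passing the restriction leave the BigQuery Storage Read API.
      static PCollection<TableRow> readWithPushDown(Pipeline p) {
        return p.apply(
            BigQueryIO.readTableRows()
                // Placeholder table spec; the real table is not shown in this log.
                .from("some-project:some_dataset.hacker_news")
                .withMethod(Method.DIRECT_READ)
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));
      }
    }
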
    Sep 03, 2020 12:45:12 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 03, 2020 12:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 03, 2020 12:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-QgRRrsHy5yGC-kk5xV_SNttf7Gi4ulg6k3lKAYQFPW0.jar
    Sep 03, 2020 12:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-ERgB7xE7TdErOIJ65RDii3rcpvughlmVo6vrwpt8zaA.jar
    Sep 03, 2020 12:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-b8TnlGoedcolo8nqq__hXz4SObymjKpneh1X5bdFNdQ.jar
    Sep 03, 2020 12:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-nuj4XhDQFjtRtIJM4mSmOaQRGUzd1sIjA7XU5f1P_7s.jar
    Sep 03, 2020 12:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-qauPXn1DVDTMc_Wuwd_UJZ4RPMYhe9xnP2QpRS0WiiQ.jar
    Sep 03, 2020 12:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-QgRRrsHy5yGC-kk5xV_SNttf7Gi4ulg6k3lKAYQFPW0.jar
    Sep 03, 2020 12:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-VuGmv4qIv-Hq2mryuXq6T6fLxXgs_BFGAOxzxIr7rik.jar
    Sep 03, 2020 12:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-osWCcWO4gKVx91XrKiMG3ks6uQwNxUUgrJvEfvAdDCk.jar
    Sep 03, 2020 12:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-sOEzn257863PhDKX6LFpqqH3LLfRQdhSM6XXCBXJUH4.jar
    Sep 03, 2020 12:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-g2o6wNv2uXIM2re9RoFx3gUem4q9f7sfpyyWFi8217Q.jar
    Sep 03, 2020 12:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-_0Zq6S3uuN9_s6d8ACdTqutK9Svo2McHMAVIczjMqcM.jar
    Sep 03, 2020 12:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-XdoSDybA9_0f7rtDr5y_PGTMr9J4_GGUFDaKkgLWeuA.jar
    Sep 03, 2020 12:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-TOlh_tHzSLGHnGikxDYjsFVuBw06SKgKeiWCN1q49W4.jar
    Sep 03, 2020 12:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-9eJY1g3pNY-Gl5xL9jNT1nfb2EmSRXLB7q0R9XDpSOU.jar
    Sep 03, 2020 12:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-UJ4b5N4z-5lhhIlys2gw21-Jt5hw45buYsWlPUwbMA0.jar
    Sep 03, 2020 12:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-jqb5MnVBi9sJ02X3-goSvSL2RUf4XbTi-mpKotzbo2E.jar
    Sep 03, 2020 12:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded--zHHd4i67d2NHHqK34doMVRHFCTXgxksC-_38ALZSUg.jar
    Sep 03, 2020 12:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-oyL8EHmdl7nxrHi2TgXScn5PH5j2RuSheqFpywvSYX4.jar
    Sep 03, 2020 12:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-q3_WYX6W7d50ubNA5h9PlwG-ZsNL7L5hlhgfjUTrsi0.jar
    Sep 03, 2020 12:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-xerSwY1cdtt2foGKl5eHwdR4ncxgGRtEu1YireZ4XwU.jar
    Sep 03, 2020 12:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-oIBKpudQb0mwc9vcq2QLnNBBYgn0yK7wPgCUpTNdXoU.jar
    Sep 03, 2020 12:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-E4zO_gRRpYA0xaoQizBkz6KnhcFdq9ubkOqXs_uYin4.jar
    Sep 03, 2020 12:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-FK85rFbrvFZvnaHBEmnnkg6ETf03wCmql1W2IaXwVM8.jar
    Sep 03, 2020 12:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-GUqRAhin8rBYVrvAEcU8_ezOjT8Q7MSQHdE0cKoI5E8.jar
    Sep 03, 2020 12:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8880456303994299915.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-wCaazML703bpdVnXbqCqC4mQpZ2YH-ZWPSdZXisaKqo.jar
    Sep 03, 2020 12:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-SWM9vCvxX8KY1SmL44EJuAI95Nhl5LqKNHvI-dfb0Wo.jar
    Sep 03, 2020 12:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-_GbhQQwHqQ8z650DJ0dXsZMLgFK9hYgpXuJP12HxGMk.jar
    Sep 03, 2020 12:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-jxqTviEGVef5lK5609siaz3ILAX_avreMLriSbnt4nc.jar
    Sep 03, 2020 12:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-4TdFIMRfkn7LnLItHcKptXYqVnYTnTcQi1jjJkhjQmQ.jar
    Sep 03, 2020 12:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-v4STDHST8ZFMnhwfq1ehDatCwV_IxYt7YGOPZD2S_wQ.jar
    Sep 03, 2020 12:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-aN_cHgtl3HirOhWF4b1SUGt7HNhfRkORR6Q8Z-FscNw.jar
    Sep 03, 2020 12:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 1 seconds
    Sep 03, 2020 12:45:15 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 03, 2020 12:45:15 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 03, 2020 12:45:15 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 03, 2020 12:45:15 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 03, 2020 12:45:15 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 03, 2020 12:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92060 bytes, hash ccfa10dd054f47ad3b889ab14440c3a02d028777e3e54ccca48ace1dc9715597> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-zPoQ3QVPR607iJqxREDDoC0Ch3fj5UzMpIrOHclxVZc.pb
    Sep 03, 2020 12:45:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 03, 2020 12:45:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-02_17_45_16-8867018529202483602?project=apache-beam-testing
    Sep 03, 2020 12:45:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-02_17_45_16-8867018529202483602
    Sep 03, 2020 12:45:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-02_17_45_16-8867018529202483602
    Sep 03, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-03T00:45:16.227Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 03, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-03T00:45:23.520Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 03, 2020 12:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-03T00:45:24.255Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 03, 2020 12:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-03T00:45:24.298Z: Expanding GroupByKey operations into optimizable parts.
    Sep 03, 2020 12:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-03T00:45:24.323Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 03, 2020 12:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-03T00:45:24.394Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 03, 2020 12:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-03T00:45:24.425Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 03, 2020 12:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-03T00:45:24.461Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 03, 2020 12:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-03T00:45:24.487Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 03, 2020 12:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-03T00:45:25.003Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 03, 2020 12:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-03T00:45:25.087Z: Starting 5 workers in us-central1-b...
    Sep 03, 2020 12:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-03T00:45:31.216Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 03, 2020 12:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-03T00:45:56.069Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 03, 2020 12:46:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-03T00:46:20.385Z: Workers have started successfully.
    Sep 03, 2020 12:46:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-03T00:46:20.413Z: Workers have started successfully.
    Sep 03, 2020 12:47:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-03T00:47:02.594Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 03, 2020 12:47:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-03T00:47:02.741Z: Cleaning up.
    Sep 03, 2020 12:47:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-03T00:47:02.847Z: Stopping worker pool...
    Sep 03, 2020 12:47:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-03T00:47:56.265Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 03, 2020 12:47:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-03T00:47:56.352Z: Worker pool stopped.
    Sep 03, 2020 12:48:03 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-02_17_45_16-8867018529202483602 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): da502796-96d2-4fb3-bdf1-876004b537d8 and timestamp: 2020-09-03T00:48:03.547000000Z:
                     Metric:                    Value:
                   read_time                    19.751
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 03, 2020 12:48:03 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.019 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.031 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker,5,main]) completed. Took 2 mins 59.912 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 47s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/j7q4wsd4i3iw6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #947

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/947/display/redirect?page=changes>

Changes:

[samuelw] [BEAM-10808] Health checks streaming rpcs to streaming engine backend

[Thomas Weise] [BEAM-10760] Optimize state cleanup for global window in portable Flink

[noreply] [BEAM-10773] Use the image flag before the default environment. (#12697)

[noreply] [BEAM-10807] Add scheduled mail with metrics report (#12685)


------------------------------------------
[...truncated 295.31 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 02, 2020 6:45:28 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 02, 2020 6:45:28 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 02, 2020 6:45:28 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 02, 2020 6:45:28 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 02, 2020 6:45:28 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 02, 2020 6:45:28 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 02, 2020 6:45:29 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
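
The same coder failure recurs in this build. Besides setRowSchema, the message also offers setCoder; a sketch of that route, with the schema again supplied by the caller rather than taken from the test:

    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowCoderSketch {
      // The error message's other suggestion: attach an explicit coder instead of a schema.
      // For a PCollection<Row> this is equivalent in effect to setRowSchema(schema).
      static PCollection<Row> withExplicitCoder(PCollection<Row> rows, Schema schema) {
        return rows.setCoder(RowCoder.of(schema));
      }
    }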

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 02, 2020 6:45:29 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 02, 2020 6:45:29 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 02, 2020 6:45:29 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 02, 2020 6:45:29 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 02, 2020 6:45:29 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 02, 2020 6:45:29 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 02, 2020 6:45:29 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 02, 2020 6:45:29 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
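
For reference, the query being planned here can also be expressed through the public SqlTransform API. The sketch below assumes an input PCollection<Row> that already carries a matching schema; it exercises only the query shape, not the BigQuery IO push-down that this test measures:

    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class PushDownQuerySketch {
      // A single input PCollection is exposed to Beam SQL under the name PCOLLECTION.
      static PCollection<Row> storiesAndJobs(PCollection<Row> hackerNews) {
        return hackerNews.apply(
            SqlTransform.query(
                "SELECT `by` AS author, type, title, score "
                    + "FROM PCOLLECTION "
                    + "WHERE (type = 'story' OR type = 'job') AND score > 2"));
      }
    }
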
    Sep 02, 2020 6:45:30 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 02, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 02, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-_PP5HNzXU72Nf6YVcDcrr5NtF1qo87FuTSaQQhrz1ms.jar
    Sep 02, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-S8_4joJ_ADMEtTIZ91pTJzs49HOMtTSlgFKfV0iKxY0.jar
    Sep 02, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-X_thT99pYsemt3gXGSxw-TIOa0E6O-PM-Kg4GvtAU8c.jar
    Sep 02, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-3El28RWaMI7sqayCDgKBclzWcXp3Ju6r0yyWDH7Cv3M.jar
    Sep 02, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-6Nf78IeQgHttBF32PI00qSithDkQhdwkBVZhIaiauhM.jar
    Sep 02, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-jcibQ-EmbfzlsKg4OxmvEnQmraV4swppbA3EieJeINU.jar
    Sep 02, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-KKzIaTa3-yPMAmj1V6vzO90Y8swE-dYZJ20Qtw3t46I.jar
    Sep 02, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-_PP5HNzXU72Nf6YVcDcrr5NtF1qo87FuTSaQQhrz1ms.jar
    Sep 02, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-N1SNkggOEnvGfmPZ5-BdEQRKrPlPTtuYyCX9yzPTT3U.jar
    Sep 02, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-Gl151wf_JpGqKQs8X6PcoKKqElIrO6un1mJ41J26ZqQ.jar
    Sep 02, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-yzX3B0liu1PGzyilpq4JnTAr6-XN8ey2O8O41WpiN5A.jar
    Sep 02, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-0VjKBULgTSJli2c2CYc_HzCDx7BYw7_0vMcJzTqJOBg.jar
    Sep 02, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test1308137050959729135.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-uBFhrhTcD3_iNsZQqg4uYSGcsUfSDNpaYBUnPKoF7OU.jar
    Sep 02, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-nFqoz841vH5Rogivn6Kx8KNSIzBaX33iXx_FkaInYLQ.jar
    Sep 02, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT--HO2Sr5RuivMZQEaSOR3j9c7viAh5dfcz_fOK2DjZos.jar
    Sep 02, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-c_40ZsPSzVch6xMpvNe43S6W_wl_F2zJub7inr0RwqA.jar
    Sep 02, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-GDXmURIr3VKFsjRijf_R7quCD_oUtYY3mBdNWkWGM24.jar
    Sep 02, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-Mo8gJgMuI8vSNkugqmgUcbX0X3jPnOXLcE0bMjJjY20.jar
    Sep 02, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-iHE2R2ISVxWxgNrkLILNKyT_xoHHUhag__gE7HeebEc.jar
    Sep 02, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-rZVTl7TRrpEbeqYotZoEcyuXbntKbAR2jYRc6grsdk4.jar
    Sep 02, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-CBaJ1O1BmJISJMdbtB3GH6lOD-h4eG-2QnueY4vYd_E.jar
    Sep 02, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-U8-Smjo-nJKOVPTVUDs6VQlp5kaA0V5Q434muaSNjwE.jar
    Sep 02, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-kXopHj6bN9pnZqM-OU3hCU8Fan2xWLEo1djynCO97S0.jar
    Sep 02, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-YUE9gy2vMrN2BIRyZ7hjrcJpS7EsJD2xLc_p9-2hvmc.jar
    Sep 02, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-pNBieWjsKmh9b0sv6MewD9IXjyc_3PynplliMpHlyx0.jar
    Sep 02, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-0RxEjG9rUA_JMGvamk-DW_Eh0dz3GV590-pAZqBLgZI.jar
    Sep 02, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-gb17k0JbqZfwubjs8O5Wy_km7yJSLw9ZLKNluoYTfK0.jar
    Sep 02, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-HSaBwlhT_PUQBMcmTWAzWHLXXazmWR5CUUzOMfahss4.jar
    Sep 02, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-eIuylK3PTcmCA2tWB5PDmhjtWIr-uyqFkTOVepEpcBo.jar
    Sep 02, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-PstTLuzTm2uiR_GbroR9BU-_9Z8pkx-wNCs2DdiVblc.jar
    Sep 02, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-efFn4Mt1TGm1X92irXdvGNznnF8vaLrBvSFAjrEtdL4.jar
    Sep 02, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 1 seconds
    Sep 02, 2020 6:45:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 02, 2020 6:45:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 02, 2020 6:45:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 02, 2020 6:45:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 02, 2020 6:45:33 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 02, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92060 bytes, hash fc2eb193cd4a19662eb7d6118117210804011db00f1def2fa2e9738db430f576> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-_C6xk81KGWYut9YRgRchCAQBHbAPHe8voulzjbQw9XY.pb
    Sep 02, 2020 6:45:33 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 02, 2020 6:45:35 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-02_11_45_33-4552928328611625561?project=apache-beam-testing
    Sep 02, 2020 6:45:35 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-02_11_45_33-4552928328611625561
    Sep 02, 2020 6:45:35 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-02_11_45_33-4552928328611625561
    Sep 02, 2020 6:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-02T18:45:33.829Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 02, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-02T18:45:41.586Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 02, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-02T18:45:42.537Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 02, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-02T18:45:42.586Z: Expanding GroupByKey operations into optimizable parts.
    Sep 02, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-02T18:45:42.615Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 02, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-02T18:45:42.689Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 02, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-02T18:45:42.714Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 02, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-02T18:45:42.750Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 02, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-02T18:45:42.783Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 02, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-02T18:45:43.137Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 02, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-02T18:45:43.204Z: Starting 5 workers in us-central1-b...
    Sep 02, 2020 6:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-02T18:45:50.120Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 02, 2020 6:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-02T18:46:09.217Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 02, 2020 6:46:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-02T18:46:34.054Z: Workers have started successfully.
    Sep 02, 2020 6:46:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-02T18:46:34.086Z: Workers have started successfully.
    Sep 02, 2020 6:47:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-02T18:47:18.519Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 02, 2020 6:47:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-02T18:47:18.674Z: Cleaning up.
    Sep 02, 2020 6:47:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-02T18:47:18.746Z: Stopping worker pool...
    Sep 02, 2020 6:48:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-02T18:48:10.620Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 02, 2020 6:48:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-02T18:48:10.663Z: Worker pool stopped.
    Sep 02, 2020 6:48:19 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-02_11_45_33-4552928328611625561 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 2a13de41-6f4b-475c-a2c3-d7a2dc2ce4ed and timestamp: 2020-09-02T18:48:19.749000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    20.975

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 02, 2020 6:48:20 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.019 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 2 mins 59.391 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 3s
106 actionable tasks: 72 executed, 34 from cache

Publishing build scan...
https://gradle.com/s/haahh5nr3pacc

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #946

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/946/display/redirect?page=changes>

Changes:

[noreply] [BEAM-10137] Add KinesisIO for cross-language usage with python wrapper

[piotr.szuberski] [BEAM-10171] Update CHANGES.md, portability website and kinesis


------------------------------------------
[...truncated 295.82 KB...]
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 02, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 02, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 02, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 02, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 02, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 02, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 02, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
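
The failure above is exactly the situation the exception describes: ParDo(RowMonitor) emits a PCollection of Beam Rows, and because no schema was attached to that output, no coder can be inferred. A minimal sketch of the remediation the message suggests (attaching a row schema to a ParDo output); the schema, DoFn and values are purely illustrative and are not taken from BigQueryIOPushDownIT:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaExample {

      // Illustrative schema matching the columns projected by the test query.
      private static final Schema SCHEMA =
          Schema.builder()
              .addNullableField("author", Schema.FieldType.STRING)
              .addNullableField("type", Schema.FieldType.STRING)
              .addNullableField("title", Schema.FieldType.STRING)
              .addNullableField("score", Schema.FieldType.INT64)
              .build();

      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        PCollection<Row> rows =
            p.apply(Create.of("story", "job"))
                .apply(
                    "ToRow",
                    ParDo.of(
                        new DoFn<String, Row>() {
                          @ProcessElement
                          public void processElement(@Element String type, OutputReceiver<Row> out) {
                            out.output(
                                Row.withSchema(SCHEMA).addValues("someone", type, "a title", 42L).build());
                          }
                        }))
                // Without this call Beam cannot infer a coder for Row and fails with the
                // IllegalStateException shown above; setCoder(RowCoder.of(SCHEMA)) is equivalent.
                .setRowSchema(SCHEMA);

        p.run().waitUntilFinish();
      }
    }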

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 02, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 02, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 02, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 02, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 02, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 02, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 02, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 02, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
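
    For reference, the usedFields and BigQueryFilter reported by the planner above map onto the selected-fields and row-restriction parameters of a BigQuery Storage API read. A hedged sketch of an equivalent stand-alone BigQueryIO read (the table reference is illustrative, and this is not the code the integration test runs):

    import com.google.api.services.bigquery.model.TableRow;
    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.values.PCollection;

    public class StorageReadPushDownExample {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // DIRECT_READ goes through the BigQuery Storage API; withSelectedFields and
        // withRowRestriction play the roles of the usedFields / BigQueryFilter
        // push-down logged by the SQL planner.
        PCollection<TableRow> rows =
            p.apply(
                BigQueryIO.readTableRows()
                    .from("apache-beam-testing:illustrative_dataset.HACKER_NEWS")
                    .withMethod(Method.DIRECT_READ)
                    .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                    .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }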
    Sep 02, 2020 12:45:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 02, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 02, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-VY9dxbDnfD6DKs8jC9bg6DA8IhFOaNyv2SPMvfF1hIg.jar
    Sep 02, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-B8P5pMCGqats9s7eV0RcdfmPBD9XcZculT8mo5sVt6g.jar
    Sep 02, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-dzULoruduafeK9twvbzdhVEjuUA0OLcMM-EgKo9zWBE.jar
    Sep 02, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-G4t3VxTcvF7yzVeBAUdcbh_pUI7yWlRouy8lMObHSlE.jar
    Sep 02, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-xHrKLTrE7twvVmsJnoLFPmqWEVNlAO1cs5HpTqeq090.jar
    Sep 02, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-dagCw1Lauxgrqi1XDvhHEwQHXClJMcFjd0YK8EFKWAw.jar
    Sep 02, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-uC_QGz_2_NzO8B1KrIBVcxQy9nyA8CjuujMpDQchmHY.jar
    Sep 02, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-MfwGXRj3V0YSSZ6yDHe-TJ1r4Y5yvJzW_HPi4CdwfPY.jar
    Sep 02, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-ArJ5MqTRgSBi11Sbqy88wFHE52FO_n8spPJagrNRwDA.jar
    Sep 02, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-3i57-OR59D5eafeIvREJutxF5IfWUpojrCuTcF6cf0c.jar
    Sep 02, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-taBYzK8lOWR6LD2b32buPzNJZeJovDYzY7VUmeszet0.jar
    Sep 02, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-aS1VQtYVeKgMPC49vKVmZXKhyYh4l39cQE39oR87QAA.jar
    Sep 02, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-S-9tbkudLI56XfySy7J3OkVe_ht0TPRddy54SOUml3g.jar
    Sep 02, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-Zp8J9mxi7kg6IxPrvSQlPihpCpO7HHWn39KFQwFw40c.jar
    Sep 02, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-QwGfgujFI6YKCuPaGbTagTaqpQYvD5QPSeXidncEQJA.jar
    Sep 02, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-BVEuWCyRirmIxPQvSzds8-Cwj9fx76MUWmV3p6k_HTU.jar
    Sep 02, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-rISo-kMVSWmumCYDMfXAjExFxN3-nPH-tS24m4TzQ84.jar
    Sep 02, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-B8P5pMCGqats9s7eV0RcdfmPBD9XcZculT8mo5sVt6g.jar
    Sep 02, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-I1w0tkizk8RvwppoExZReHMbo2WszfTSZ16CdjudIsg.jar
    Sep 02, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-sEA9NXjKl7_W_SvhVKYSPMRFqVJ-ZmxTKdPTXayJIN4.jar
    Sep 02, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-H1e2vO_3vQmhs76iqyxtumXs6qqMoAnOQ8bsJGCn_A0.jar
    Sep 02, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-VfvkpZHBlnY_BCjnQzET_Il82QMzUeSSR5b8ZexkN5s.jar
    Sep 02, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test6662921369214484832.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-ICkQDbO9_zAiHldC9kPaj4fH-pfjhBgJEhIJxJKyOqw.jar
    Sep 02, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-V8M_MqhKQdbrIQKrd1Vw1H7wWRY6lvEnjVB87DbNPzA.jar
    Sep 02, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-_qxVsiuHbaDu6291ltXXbF5XouBuYWidNwOGBvZHews.jar
    Sep 02, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-YH_v-WdRdPQ8-j0HkHo8tpzIg5pqQsNg-hZ0IEyncx0.jar
    Sep 02, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-X6InTklmB5QQ8EzV3_Mi8demLVCPLZwShN19PdldHM0.jar
    Sep 02, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-A4VbsKNHTJeP0arw4jSFYr3a5_yXk5KwEtZcRQt1XSc.jar
    Sep 02, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-4sxeDhvpww57xOiOjQKL0iLBlt7eCi_-qPkwL0mBApw.jar
    Sep 02, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-Di-B2mVgKgkjVZxuhy7w7ouwHrEXz7FQ0e8j6NZErOc.jar
    Sep 02, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-3-lz7fF9ArqvB7eGkifjKHGKV4oXB3PbRLWfu34MZ2c.jar
    Sep 02, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 0 seconds
    Sep 02, 2020 12:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 02, 2020 12:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 02, 2020 12:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 02, 2020 12:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 02, 2020 12:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 02, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92060 bytes, hash b24c7bd969a78fc71739762c00be5aa23492e177ee04a1787127a4db6d590849> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-skx72Wmnj8cXOXYsAL5aojSS4XfuBKF4cSek221ZCEk.pb
    Sep 02, 2020 12:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 02, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-02_05_45_20-9891651083120569959?project=apache-beam-testing
    Sep 02, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-02_05_45_20-9891651083120569959
    Sep 02, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-02_05_45_20-9891651083120569959
    Sep 02, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-02T12:45:20.560Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 02, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-02T12:45:28.554Z: Worker configuration: n1-standard-1 in us-central1-a.
    Sep 02, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-02T12:45:29.390Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 02, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-02T12:45:29.427Z: Expanding GroupByKey operations into optimizable parts.
    Sep 02, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-02T12:45:29.454Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 02, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-02T12:45:29.508Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 02, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-02T12:45:29.544Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 02, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-02T12:45:29.570Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 02, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-02T12:45:29.590Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 02, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-02T12:45:29.978Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 02, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-02T12:45:30.053Z: Starting 5 workers in us-central1-a...
    Sep 02, 2020 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-02T12:45:43.016Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 02, 2020 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-02T12:46:01.390Z: Autoscaling: Raised the number of workers to 2 based on the rate of progress in the currently running stage(s).
    Sep 02, 2020 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-02T12:46:01.423Z: Resized worker pool to 2, though goal was 5.  This could be a quota issue.
    Sep 02, 2020 12:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-02T12:46:06.841Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 02, 2020 12:46:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-02T12:46:32.210Z: Workers have started successfully.
    Sep 02, 2020 12:46:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-02T12:46:32.290Z: Workers have started successfully.
    Sep 02, 2020 12:47:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-02T12:47:02.474Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 02, 2020 12:47:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-02T12:47:02.627Z: Cleaning up.
    Sep 02, 2020 12:47:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-02T12:47:02.700Z: Stopping worker pool...
    Sep 02, 2020 12:48:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-02T12:47:59.959Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 02, 2020 12:48:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-02T12:48:00.004Z: Worker pool stopped.
    Sep 02, 2020 12:48:09 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-02_05_45_20-9891651083120569959 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): c397d378-50e4-4536-a562-a54c7186cf0d and timestamp: 2020-09-02T12:48:09.158000000Z:
                     Metric:                    Value:
                   read_time                    11.305
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 02, 2020 12:48:09 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.018 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 3 mins 2.507 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 53s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/fpoc3ca23hvew

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #945

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/945/display/redirect>

Changes:


------------------------------------------
[...truncated 293.51 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 02, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 02, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 02, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 02, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 02, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 02, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 02, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 02, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 02, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 02, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 02, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 02, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 02, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 02, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 02, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 02, 2020 6:45:15 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 02, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 02, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-KOd-t3lrI9tsXrTWAeDYSqf9ltd9eVn-hOqtfEEzkJY.jar
    Sep 02, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-DzVgeBQnoH5N4nGqcpL7IDTUErgLWWhA-ewqkGramgk.jar
    Sep 02, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-6e6D4MG2zUe2_RcphhPzANMJAUlzFfbtMb-WX0njcx0.jar
    Sep 02, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-gC7LjBf0yEmsfQpRIawwOCyv8XsEZ9dDZpUyNrjllc8.jar
    Sep 02, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-AsW-Iya9UMeKB2toRxhPUmxo13QkCzYupQhLMn6lMF8.jar
    Sep 02, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-kXg64LFC1m3yl9k6vFHCxHnZ82WH9SJW4AcFbV70uNQ.jar
    Sep 02, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-EvmN2bK1V9Ey_Hd86mxBjMPk8DZsZvuh8v2B1EYNIRE.jar
    Sep 02, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-M5XD3kATKT-YXM1rBAMOMFkeeIIbnbUINP5X08YLbbw.jar
    Sep 02, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-pSYIPf8L8X6DazhjuLzovtU9j-5jBOz3lHLNscMSO8o.jar
    Sep 02, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-Fk43MUqGaRBvbd3zLXTFyu1alkW07AWQeCprwxWcklQ.jar
    Sep 02, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-rlvOvtZwOsuI6mEuvegAnGy-v6Ui1q_gc7JiQmmCZo8.jar
    Sep 02, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-58zkxMlwQPukP7fKWqazYHLIONF_aH-amSYnWE3cYus.jar
    Sep 02, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-nIJJz_2DZjQFUmGWuUJQwRpDrdh3fORm0nZl7LP3gHg.jar
    Sep 02, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-UZQmo-TFJW0y0g8_FYXyjvnp3zlU-n2TEhM8LJFAJr0.jar
    Sep 02, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-YIUNlOHd7zbKvo2C7CBZ8iKjeXSEwQ-XR5pJxGAvr_o.jar
    Sep 02, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT--YjAhs084gFPg6nD5koR0EAyJELG7sP0PWBhEKRDZQk.jar
    Sep 02, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-71N6F8-ImACvAszX1JjieJbtFqD2oyQkcL6-LZDlPB4.jar
    Sep 02, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-RRpZw5ZnCXKWNFlsggTeUX50ZFfiAvAquzPPeNRKnvY.jar
    Sep 02, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-T4PrL4W2FJPotviYc-C_5PgkpWqEfUlMcKyQZ77OgX8.jar
    Sep 02, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-0QOvfm166pa4QGvQ-NjSiHNZna8Hu9ygFwP6ZTzhdBc.jar
    Sep 02, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-AD3Us4kHaTaNoEGpvQzlxv2e0ZGEDnOlrLPLqM271VY.jar
    Sep 02, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-Jnw848QFDk1rMNXF9-o3PE99m66Q2HqEZP3d57flWNY.jar
    Sep 02, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2066430202035505090.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-bIJ2k3mWJb1oLlBjmqP5v8xbQ4b_gHo-ICvdn1ffSpE.jar
    Sep 02, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-EAGnyhsIp9MqK2SsAX9vwmIddcX3tnckhHJvPsJQ_7s.jar
    Sep 02, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-C3HMwAlTNmNwXw9hDTmHeTUUxhoJfZtc5ibCCSauzhw.jar
    Sep 02, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-fCGnuyVKomXI1GhtyKCk3cLWtH3XVumSGL9bdmBn3Lk.jar
    Sep 02, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-KOd-t3lrI9tsXrTWAeDYSqf9ltd9eVn-hOqtfEEzkJY.jar
    Sep 02, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-eUPlgD2vpfXjvqpO1QNzD9h2dAzEDrf-8ul_LWO59Ps.jar
    Sep 02, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-A8AQEw7oOuAuIFCFbgzr7VAYP0in7VDuq1mxVXPvlMY.jar
    Sep 02, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-iwawgpN5arSK3n2oksKzByt_4Cfb_82inUwvIWOk3hY.jar
    Sep 02, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-FZylliZthsgaT3dseX_o-Ei0KsCxkwrK_0S5YXi_2Ns.jar
    Sep 02, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 1 seconds
    Sep 02, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 02, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 02, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 02, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 02, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 02, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92060 bytes, hash 4c5229944f1b76409bbf6976d4f7df94388daf1a5a3e12958de7e6c4079604b6> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-TFIplE8bdkCbv2l21PfflDiNrxpaPhKVjefmxAeWBLY.pb
    Sep 02, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 02, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-01_23_45_18-9554033431803034719?project=apache-beam-testing
    Sep 02, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-01_23_45_18-9554033431803034719
    Sep 02, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-01_23_45_18-9554033431803034719
    Sep 02, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-02T06:45:18.981Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 02, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-02T06:45:27.353Z: Worker configuration: n1-standard-1 in us-central1-a.
    Sep 02, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-02T06:45:29.175Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 02, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-02T06:45:29.218Z: Expanding GroupByKey operations into optimizable parts.
    Sep 02, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-02T06:45:29.255Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 02, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-02T06:45:29.332Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 02, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-02T06:45:29.359Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 02, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-02T06:45:29.393Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 02, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-02T06:45:29.427Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 02, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-02T06:45:29.999Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 02, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-02T06:45:30.081Z: Starting 5 workers in us-central1-a...
    Sep 02, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-02T06:45:35.054Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 02, 2020 6:46:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-02T06:46:02.825Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 02, 2020 6:46:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-02T06:46:21.582Z: Workers have started successfully.
    Sep 02, 2020 6:46:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-02T06:46:21.618Z: Workers have started successfully.
    Sep 02, 2020 6:46:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-02T06:46:56.628Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 02, 2020 6:46:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-02T06:46:56.857Z: Cleaning up.
    Sep 02, 2020 6:46:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-02T06:46:56.966Z: Stopping worker pool...
    Sep 02, 2020 6:47:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-02T06:47:48.334Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 02, 2020 6:47:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-02T06:47:48.377Z: Worker pool stopped.
    Sep 02, 2020 6:47:56 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-01_23_45_18-9554033431803034719 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): e99441a0-b177-4117-a71d-2497c7c994b3 and timestamp: 2020-09-02T06:47:56.621000000Z:
                     Metric:                    Value:
                   read_time                    14.238
                 fields_read                 4375276.0
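
The read_time and fields_read values above are Beam metrics extracted from the finished Dataflow job. As a hedged illustration only (this is not the actual test code; the namespace string and the DoFn are hypothetical), per-run metrics like these are typically recorded with the Beam Metrics API and queried back from the PipelineResult:

    import org.apache.beam.sdk.metrics.Counter;
    import org.apache.beam.sdk.metrics.Metrics;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.values.Row;

    /** Hypothetical monitor DoFn: counts every field it sees under "fields_read". */
    class FieldCountingFn extends DoFn<Row, Row> {
      private final Counter fieldsRead = Metrics.counter("BigQueryIOPushDownIT", "fields_read");

      @ProcessElement
      public void processElement(ProcessContext c) {
        fieldsRead.inc(c.element().getFieldCount());
        c.output(c.element());
      }
    }

    // After p.run().waitUntilFinish(), the counter can be read back from the result, e.g.:
    //   result.metrics().queryMetrics(MetricsFilter.builder()
    //       .addNameFilter(MetricNameFilter.named("BigQueryIOPushDownIT", "fields_read"))
    //       .build());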

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 02, 2020 6:47:57 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
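
The metrics above are computed but not exported because the InfluxDB settings were not passed to the test. As an assumption-laden sketch only (these option names reflect how Beam's test-utils InfluxDB publisher is commonly configured, but the exact names and the values shown here are not taken from this job and may differ), such settings are usually supplied through the integration-test pipeline options:

    -DintegrationTestPipelineOptions='[
        "--influxDatabase=beam_test_metrics",
        "--influxMeasurement=sql_bqio_read_java_batch",
        "--influxHost=http://localhost:8086"
    ]'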

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 2 mins 50.902 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
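
To dig into these failures locally, the failing Gradle task can be re-run from the repository root with a test filter and the suggested flags. A sketch (the task path and test class come from the log above; the -D system properties for project, buckets, and runner that the IT also needs are omitted here):

    ./gradlew :sdks:java:extensions:sql:perf-tests:integrationTest \
        --tests "org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT" \
        --stacktrace --info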

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 40s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/yizmjht4qlu3q

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #944

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/944/display/redirect?page=changes>

Changes:

[noreply] Add note about state and timers in the face of retry.

[Robin Qiu] Fix NPE in BeamUnnestRel

[Brian Hulette] Set SETUPTOOLS_USE_DISTUTILS=stdlib on jenkins

[noreply] Copy subtransforms to output of translations.pipeline_from_stages()

[noreply] [BEAM-5122][BEAM-7582] Re-enable ignored tests in PubsubJsonIT (#12750)


------------------------------------------
[...truncated 293.96 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 02, 2020 12:45:25 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 02, 2020 12:45:25 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 02, 2020 12:45:25 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 02, 2020 12:45:25 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 02, 2020 12:45:25 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 02, 2020 12:45:25 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 02, 2020 12:45:26 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
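
The IllegalStateException above is the coder-inference failure the message itself describes: the Row output of the RowMonitor step has no schema attached, so no RowCoder can be inferred for it. A minimal, self-contained sketch of the remedy the message suggests, using PCollection.setRowSchema (a generic illustration, not the actual fix to BigQueryIOPushDownIT; the field names and types mirror the projected columns in the SQL above but are assumptions, and it assumes the DirectRunner is on the classpath):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.MapElements;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;
    import org.apache.beam.sdk.values.TypeDescriptor;

    public class RowSchemaExample {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        // Schema mirroring the projected columns (author, type, title, score).
        Schema schema = Schema.builder()
            .addNullableField("author", Schema.FieldType.STRING)
            .addNullableField("type", Schema.FieldType.STRING)
            .addNullableField("title", Schema.FieldType.STRING)
            .addNullableField("score", Schema.FieldType.INT64)
            .build();

        PCollection<Row> rows = p
            .apply(Create.of(
                    Row.withSchema(schema).addValues("someone", "story", "A title", 3L).build())
                .withCoder(RowCoder.of(schema)))
            // A pass-through step whose Row output has no coder yet, analogous to
            // ParDo(RowMonitor) in the failing pipeline.
            .apply(MapElements.into(TypeDescriptor.of(Row.class)).via((Row r) -> r));

        // Without this call, coder inference for the Row output fails exactly as in the
        // stack trace above ("Cannot provide a coder for a Beam Row").
        rows.setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }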

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 02, 2020 12:45:26 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 02, 2020 12:45:26 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 02, 2020 12:45:26 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 02, 2020 12:45:26 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 02, 2020 12:45:26 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 02, 2020 12:45:26 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 02, 2020 12:45:26 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 02, 2020 12:45:26 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
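
The DIRECT_READ method that makes this filter and project push-down possible is a property of the Beam SQL table itself. A hedged sketch of how such a table is typically declared (the column list is abbreviated and the LOCATION value is an assumption, not necessarily the dataset used by this test):

    CREATE EXTERNAL TABLE HACKER_NEWS (
      title VARCHAR,
      score BIGINT,
      `by` VARCHAR,
      type VARCHAR
    )
    TYPE bigquery
    LOCATION 'apache-beam-testing:hacker_news.full'
    TBLPROPERTIES '{"method": "DIRECT_READ"}'

The query is then planned against this table; per the stack traces above, the IT turns the resulting plan into a PCollection with BeamSqlRelUtils.toPCollection, and the filter shown in this log entry is handed to the BigQuery storage read rather than being evaluated in the Calc step.
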
    Sep 02, 2020 12:45:27 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 02, 2020 12:45:29 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 02, 2020 12:45:29 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-CkByS6Vi03fnzYyqRZf6A6bYN0DftN1s1bYva6QmisU.jar
    Sep 02, 2020 12:45:29 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT--ohKBsTaNQURNx5t-hKfPuoGPhfjXHlFabXyPqxpX24.jar
    Sep 02, 2020 12:45:29 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-ALAUMGTQmVprbpUxC2C2LXOySh254BuQxa2wFbghLG8.jar
    Sep 02, 2020 12:45:29 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-kxy76euLTR-hxd7v9VR_4shqXF5_F76JqNp2OVtUEmY.jar
    Sep 02, 2020 12:45:29 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-39v1VYsV4jTXMLATRpBUAFQggwUI2rhhCxvy-XvlVCg.jar
    Sep 02, 2020 12:45:29 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-SmgJOQC12P__RBg8nlVh8CqfVKludDDuvyZKo5-ePRo.jar
    Sep 02, 2020 12:45:29 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5115452506011802753.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-9jDpfIsf325FMZqfDeOII_vMW34pJbNNHOezb3eQvwE.jar
    Sep 02, 2020 12:45:29 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-2POUoEAdhtSvPKsHyV1_6JkBqIWA2TaIfHScj2WjRRo.jar
    Sep 02, 2020 12:45:29 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-hj1sg9BolUU4VU-u6cvYtcvv7XpbMmIe0_ilKdwyU3s.jar
    Sep 02, 2020 12:45:29 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-likTgPApSFeEGNKNeYHS3XIQcR1zlrCSLZhOzN66MfQ.jar
    Sep 02, 2020 12:45:29 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-Qey_QlKKA3R7HajD6iSyH3U6LEtCMXWU6DkSBcDl7Lk.jar
    Sep 02, 2020 12:45:29 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-C-xvGIjwY30Rcp8APci-A0bba5u3Z4ma4ax4ldHVuQ8.jar
    Sep 02, 2020 12:45:29 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-Fo4kNEmslfnUI7Hz59BondFGK3wXp8bxasZr0MpQ9GM.jar
    Sep 02, 2020 12:45:29 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-N05T0Fg8d2xnex6bA7fYm5hiO3zCdcp3Uf6gMzTMrr4.jar
    Sep 02, 2020 12:45:29 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-CkByS6Vi03fnzYyqRZf6A6bYN0DftN1s1bYva6QmisU.jar
    Sep 02, 2020 12:45:29 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-lv9ib10pZchoqxmDDnD-1TRDCfA_-xQKlQrZnoWEd7k.jar
    Sep 02, 2020 12:45:29 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT--r1ayfvpqc1kBaHd458oUqwdAPgjUcGVaaI1WEvGIvI.jar
    Sep 02, 2020 12:45:29 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-JEbThhy5_oAiIZLVya_LjLD7_Ak9NhwrwfW6Tehp0cg.jar
    Sep 02, 2020 12:45:29 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-cUCCUP9oUQYdqvjs4KZr2y9Ue37F9XSa3_wMVM1Bw6w.jar
    Sep 02, 2020 12:45:29 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-Ci65Z2NK88peWLKe9RuT9Qu6dG_q7vk6vvIFeaXT-w0.jar
    Sep 02, 2020 12:45:29 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-Xs9z8YL5LG4pYMSASCEEHrdMOVZH1bf3djttjbdKFao.jar
    Sep 02, 2020 12:45:29 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-BkVfXy1F1wB52G6FpnAuRvhUc_LkRgS5mJpgq57mg94.jar
    Sep 02, 2020 12:45:29 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-MibRpgaw1cvCs5SVbCsGi1cRIATJ7-eDaAnXsl_u2eY.jar
    Sep 02, 2020 12:45:29 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-PCvZ_wWoBa8FrTyWAMB61HlfbS0vBQBa5cbirnQcakM.jar
    Sep 02, 2020 12:45:29 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-4_Kt6uf9gfR62EKHfrhLffYuB_17EiGn4ylBxodX9t8.jar
    Sep 02, 2020 12:45:29 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-ACvUfwmfa7udi13vBkiLpmjHeCXY6QKgaEbR26H53ec.jar
    Sep 02, 2020 12:45:29 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-RYpse57sjQBi-hVeKzBZHWJ0CmK71Wnorede_9SpD1k.jar
    Sep 02, 2020 12:45:29 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-gQjMjFdy7GxVPMGmtuoZMtxvk91wqimlsmSCJkCf724.jar
    Sep 02, 2020 12:45:29 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-msANlhpWsP6f2SPCzouG_w36YpMGImq7dP2vzZYZOyw.jar
    Sep 02, 2020 12:45:29 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-5orN1eLcO_ktL9gOuXlDVJtK0ir-oNVi-EOn_yo8-7o.jar
    Sep 02, 2020 12:45:29 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-yhw8YkXd-a0D3MZFpppVucM4Xr2rhDLpCI8d6V9ckH4.jar
    Sep 02, 2020 12:45:30 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 0 seconds
    Sep 02, 2020 12:45:30 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 02, 2020 12:45:30 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 02, 2020 12:45:30 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 02, 2020 12:45:30 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 02, 2020 12:45:30 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 02, 2020 12:45:30 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92058 bytes, hash 0ea7490b6a4b39d4060edaa623413e83f2d43169bb9894bb0828bc36cc363783> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-DqdJC2pLOdQGDtqmI0E-g_LUMWm7mJS7CCi8Nsw2N4M.pb
    Sep 02, 2020 12:45:30 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 02, 2020 12:45:31 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-01_17_45_30-10900350396300131671?project=apache-beam-testing
    Sep 02, 2020 12:45:31 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-01_17_45_30-10900350396300131671
    Sep 02, 2020 12:45:31 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-01_17_45_30-10900350396300131671
    Sep 02, 2020 12:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-02T00:45:30.617Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 02, 2020 12:45:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-02T00:45:37.734Z: Worker configuration: n1-standard-1 in us-central1-a.
    Sep 02, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-02T00:45:38.548Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 02, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-02T00:45:38.581Z: Expanding GroupByKey operations into optimizable parts.
    Sep 02, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-02T00:45:38.661Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 02, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-02T00:45:38.732Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 02, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-02T00:45:38.760Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 02, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-02T00:45:38.794Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 02, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-02T00:45:38.831Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 02, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-02T00:45:39.162Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 02, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-02T00:45:39.241Z: Starting 5 workers in us-central1-a...
    Sep 02, 2020 12:46:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-02T00:46:07.121Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 02, 2020 12:46:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-02T00:46:11.808Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 02, 2020 12:46:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-02T00:46:27.448Z: Workers have started successfully.
    Sep 02, 2020 12:46:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-02T00:46:27.483Z: Workers have started successfully.
    Sep 02, 2020 12:47:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-02T00:46:59.456Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 02, 2020 12:47:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-02T00:46:59.601Z: Cleaning up.
    Sep 02, 2020 12:47:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-02T00:46:59.680Z: Stopping worker pool...
    Sep 02, 2020 12:47:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-02T00:47:51.269Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 02, 2020 12:47:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-02T00:47:51.319Z: Worker pool stopped.
    Sep 02, 2020 12:48:01 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-01_17_45_30-10900350396300131671 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): a8524f8c-4c06-4edf-8d7d-a8073690d3c0 and timestamp: 2020-09-02T00:48:01.102000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    11.527

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 02, 2020 12:48:01 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.019 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 2 mins 43.597 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 44s
106 actionable tasks: 72 executed, 34 from cache

Publishing build scan...
https://gradle.com/s/ausgfzcqeuacs

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #943

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/943/display/redirect?page=changes>

Changes:

[dcavazos] [BEAM-7705] Update BigQuery docs with new code snippets

[Boyuan Zhang] Wrap RestrictionTracker in SplittableParDoNaiveBounded

[Luke Cwik] [BEAM-5002] Upgrade SLF4J version in all non-vendored artifacts.

[ningk] [BEAM-10775] Added nodejs installation and documentation


------------------------------------------
[...truncated 299.16 KB...]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 01, 2020 6:46:39 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 01, 2020 6:46:39 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 01, 2020 6:46:39 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 01, 2020 6:46:39 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 01, 2020 6:46:39 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 01, 2020 6:46:39 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 01, 2020 6:46:39 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 01, 2020 6:46:40 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 01, 2020 6:46:40 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 01, 2020 6:46:40 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 01, 2020 6:46:40 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 01, 2020 6:46:40 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 01, 2020 6:46:40 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 01, 2020 6:46:40 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 01, 2020 6:46:40 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 01, 2020 6:46:41 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 01, 2020 6:46:45 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 01, 2020 6:46:45 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-8i-lR4XhB3xXAdj9md2n7qsLyY7qfKKYzJKhP6ZrEmg.jar
    Sep 01, 2020 6:46:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4977314796790561547.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-eZzWxnSVradASxoCWOBEBQ_bd14MeCwjNktqNJEVZIs.jar
    Sep 01, 2020 6:46:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-NX_8gwHJliqCff_LLICstZUREs6RZpGLQTVDItynzgk.jar
    Sep 01, 2020 6:46:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-5W0zG9xqmZadMNkh2ooBZ-zePMfiS561mNBkBfZr15s.jar
    Sep 01, 2020 6:46:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-zYnemsAEN85Wz4nww2JCxTzor3mmWD0xccM0ozOiHgg.jar
    Sep 01, 2020 6:46:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-oGRnOhV7Z8s2Nqyy4SPlAqqJ5eenTXMGdHlFjTQtIl4.jar
    Sep 01, 2020 6:46:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-8RnPJZc3DjIdgys9A07j95Q63YwO2mYS0JRE2LBBY1U.jar
    Sep 01, 2020 6:46:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-DFISKrGLqoHt0xKV8McnwOSzDoJEcQjhbpjPsjZtJVo.jar
    Sep 01, 2020 6:46:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-CmLANFeK_5YFQ1sLXEEAIv0xQtvP1wzdL5EgLze7StA.jar
    Sep 01, 2020 6:46:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-Y-ndT-QYbH0o2ZYHt4jSJcXURHHq30ciEaMfVVHUK54.jar
    Sep 01, 2020 6:46:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-s4INpzdzY6J0ZikkZjc3_pAX8oqLy1jbG9r09h_pscA.jar
    Sep 01, 2020 6:46:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-DG0ptfkqLldNsGW50j1ae1gBe9CRVG3lYRnNLhLCE1E.jar
    Sep 01, 2020 6:46:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-7_H_Z_l_nOAd1gSYeuL8N7q4nxsf7e18_Sll2NNa8uU.jar
    Sep 01, 2020 6:46:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-141cEpJTOJZX9fw6ZNssN_Jb8TKCkgvgfnXye5ydINI.jar
    Sep 01, 2020 6:46:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-D7gYo2rkxhtFrrRZ_xvLyRmuaEDMe8PDsd0OJMqA4EE.jar
    Sep 01, 2020 6:46:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-YnorRRQ0est87edOSkEavwI6VZ5TMuiLC35Xtp9nfiU.jar
    Sep 01, 2020 6:46:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-H971YkGLJgn7Wp2R1Tn_OSC9iqFSfi1PQTOaie9rVZ4.jar
    Sep 01, 2020 6:46:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-sAec5WUWEDnR4UytOs4kAEAF7NuJwMGBialcUUeMeLc.jar
    Sep 01, 2020 6:46:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-5kDCsV21eXnl4XtrH3SXLi9MewIykCr9zVfS-KjnPLg.jar
    Sep 01, 2020 6:46:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-51DCNfCYgsReqGE5--gomyutpZQuxtBjX9V2Jn_-BRU.jar
    Sep 01, 2020 6:46:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-mXfBO4_YysuWhPLJ098gHET3sv5ysxIP2ZB28FRYjdc.jar
    Sep 01, 2020 6:46:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-cjGeof1INSy5nswqeUAd2Lq-X8D-epYXeYY3iAZTLKk.jar
    Sep 01, 2020 6:46:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-8i-lR4XhB3xXAdj9md2n7qsLyY7qfKKYzJKhP6ZrEmg.jar
    Sep 01, 2020 6:46:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-yKAdEEoy50A1viTOt_d_BHRzSLVJYKKF4VwQZBxJdXs.jar
    Sep 01, 2020 6:46:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-CnKrbJ8n7FMQJNIf4Gatrz_aEaZrvtVbNg8L2q4W0-E.jar
    Sep 01, 2020 6:46:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-rOFO8tb6uT4lJedNGsa71JoGEnD4J5T6EbIYRrLCfdg.jar
    Sep 01, 2020 6:46:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-W7BA6Zcihh5vFOk7SRTxB0C8S9tUh5zDLKnoO93E-M4.jar
    Sep 01, 2020 6:46:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-m99oTkyzG9HlQ6uw9QWxrwL-EYgte8RiugaJKKTb0QY.jar
    Sep 01, 2020 6:46:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-yRopVnfdIR08nv9hi7H0Oo3-pSgY-7emXMYOmX9AFZc.jar
    Sep 01, 2020 6:46:47 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-X2uC3J7CExuJ5DvRvId5w00v2uHdRUaYHV3tsEpBnd0.jar
    Sep 01, 2020 6:46:47 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-DUe-2eTJ77BaQhXORMAo6lDK_Po9Q6hwZsVBB3Wxz1w.jar
    Sep 01, 2020 6:46:48 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 3 seconds
    Sep 01, 2020 6:46:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 01, 2020 6:46:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 01, 2020 6:46:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 01, 2020 6:46:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 01, 2020 6:46:48 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 01, 2020 6:46:48 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92060 bytes, hash deb326e4db79bd1f64c35bee48b53dbfae4be619e30033a7b9102a1cfe14f63b> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-3rMm5Nt5vR9kw1vuSLU9v65L5hnjADOnuRAqHP4U9js.pb
    Sep 01, 2020 6:46:48 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 01, 2020 6:46:50 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-01_11_46_48-892135961897202337?project=apache-beam-testing
    Sep 01, 2020 6:46:50 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-01_11_46_48-892135961897202337
    Sep 01, 2020 6:46:50 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-01_11_46_48-892135961897202337
    Sep 01, 2020 6:46:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-01T18:46:49.010Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 01, 2020 6:46:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-01T18:46:56.572Z: Worker configuration: n1-standard-1 in us-central1-c.
    Sep 01, 2020 6:46:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-01T18:46:57.563Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 01, 2020 6:46:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-01T18:46:57.603Z: Expanding GroupByKey operations into optimizable parts.
    Sep 01, 2020 6:46:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-01T18:46:57.640Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 01, 2020 6:46:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-01T18:46:57.770Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 01, 2020 6:46:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-01T18:46:57.808Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 01, 2020 6:46:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-01T18:46:57.835Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 01, 2020 6:46:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-01T18:46:57.865Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 01, 2020 6:46:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-01T18:46:58.171Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 01, 2020 6:46:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-01T18:46:58.255Z: Starting 5 workers in us-central1-c...
    Sep 01, 2020 6:47:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-01T18:47:17.233Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 01, 2020 6:47:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-01T18:47:27.925Z: Autoscaling: Raised the number of workers to 2 based on the rate of progress in the currently running stage(s).
    Sep 01, 2020 6:47:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-01T18:47:27.953Z: Resized worker pool to 2, though goal was 5.  This could be a quota issue.
    Sep 01, 2020 6:47:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-01T18:47:33.379Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 01, 2020 6:47:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-01T18:47:52.693Z: Workers have started successfully.
    Sep 01, 2020 6:47:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-01T18:47:52.732Z: Workers have started successfully.
    Sep 01, 2020 6:48:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-01T18:48:30.135Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 01, 2020 6:48:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-01T18:48:30.281Z: Cleaning up.
    Sep 01, 2020 6:48:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-01T18:48:30.363Z: Stopping worker pool...
    Sep 01, 2020 6:49:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-01T18:49:24.487Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 01, 2020 6:49:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-01T18:49:24.543Z: Worker pool stopped.
    Sep 01, 2020 6:49:33 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-01_11_46_48-892135961897202337 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 5b364442-9020-4561-a9da-de2ca74e540a and timestamp: 2020-09-01T18:49:33.813000000Z:
                     Metric:                    Value:
                   read_time                    14.291
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 01, 2020 6:49:34 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 3 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.024 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 3 mins 6.837 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 10s
106 actionable tasks: 76 executed, 30 from cache

Publishing build scan...
https://gradle.com/s/bnidex6hcqipc

Stopped 2 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #942

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/942/display/redirect>

Changes:


------------------------------------------
[...truncated 294.02 KB...]
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 01, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 01, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 01, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 01, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 01, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 01, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 01, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
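
The exception above points at the fix itself: either call .setCoder() explicitly or, for Beam Rows, attach a schema so a RowCoder can be inferred. Purely as an illustration (this is not the BigQueryIOPushDownIT code; the schema, values and class name below are made up), a minimal standalone sketch of PCollection.setRowSchema plus the same projection/filter expressed through Beam SQL:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSqlSketch {
      public static void main(String[] args) {
        Pipeline pipeline = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Hypothetical schema covering only the columns used by the query in the log above.
        Schema schema =
            Schema.builder()
                .addNullableField("by", Schema.FieldType.STRING)
                .addNullableField("type", Schema.FieldType.STRING)
                .addNullableField("title", Schema.FieldType.STRING)
                .addNullableField("score", Schema.FieldType.INT64)
                .build();

        Row story =
            Row.withSchema(schema).addValues("someone", "story", "Example title", 3L).build();

        // Attaching the schema is what the error message asks for: with setRowSchema the
        // CoderRegistry can hand out a RowCoder, so pipeline construction gets past
        // finishSpecifying instead of throwing IllegalStateException.
        PCollection<Row> rows =
            pipeline
                .apply(Create.of(story).withRowSchema(schema))
                .setRowSchema(schema);

        // A single schema'd PCollection is visible to Beam SQL under the table name PCOLLECTION.
        // The result is unused here; the sketch only exercises pipeline construction.
        rows.apply(
            SqlTransform.query(
                "SELECT `by` AS author, `type`, `title`, `score` "
                    + "FROM PCOLLECTION "
                    + "WHERE (`type` = 'story' OR `type` = 'job') AND `score` > 2"));

        pipeline.run().waitUntilFinish();
      }
    }

The sketch runs on the direct runner and only demonstrates the API the error message mentions; in the integration test the schema is expected to come from the BigQuery table provider rather than being attached by hand.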

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 01, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 01, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 01, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 01, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 01, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 01, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 01, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 01, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 01, 2020 12:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 01, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 01, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-XitvhSDfN1N42MZL-cYzxKDVSIcWUqQ5YEnA7D08N98.jar
    Sep 01, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-0LoAsGl10nI85VQOPh3bieOugF9wQxh630nAcDyfHNI.jar
    Sep 01, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-7F8KWRDNb40mOOQlrvu_rdB43PfrENQdBvPuALH_IwQ.jar
    Sep 01, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-wZ0U6_9q6EdZIzBD4dmhFt8hV6d5AJ_JQxlT1SnGlQY.jar
    Sep 01, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-9aBDkyLLgmlfoRf_XNlQ_UPd3wXVq2yRHdBidrJSKAA.jar
    Sep 01, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-9oC9nsLrhZwHqiNHwrPl6Oir1fbpLJdwOi0bFhJtKkA.jar
    Sep 01, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-XitvhSDfN1N42MZL-cYzxKDVSIcWUqQ5YEnA7D08N98.jar
    Sep 01, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3325441058382901605.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-kSB2evDjyEsZ1z_MYnUl5tcwz2C3cV6FrPfDNt2Zn2E.jar
    Sep 01, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-aSMNGcmy-VG7LQ9iCWXFjyqnyOv0QEG-UogCClsbc2o.jar
    Sep 01, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-xMCA6AL27CTjRFcLg3FqAjIVJswlf5FK73jfioKOAJI.jar
    Sep 01, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-kylu_mtxjDpL7h0N36FL47VUWFIHAMSZsotWMbWKyJQ.jar
    Sep 01, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-EBaRJ130aa6RTcvBvM02ARXKcYDZGApX4oPPiw3H5IY.jar
    Sep 01, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-xbxZ2jhm-FmmTCW3q9SXuN80gT-dYM9qTNWOcic2S2Y.jar
    Sep 01, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-nf2SDI5wCo0L1gmxmz4Vo96e8mt_HBgJWJtR9mMHpc4.jar
    Sep 01, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-ogd0Xuu-MiEPA4yKv0i6YTYsbojMqTyhv60WtglxgBI.jar
    Sep 01, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-lZdupGIt8OijHoB479fvdSpDnp-3iN217dz4fjxAdkc.jar
    Sep 01, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-WSt-QMufSuY06ELRmOMaIU1gms2w6jgfB0WmRa1YcNs.jar
    Sep 01, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-SyeqXGgdfo2eR0TxH48ZPDdDic34LH8f19l2JC34f2k.jar
    Sep 01, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-iL3tqfu4SOA11Rm553dP3Ek4B8UHTx3xcGEOaHomPkE.jar
    Sep 01, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-r_6wnkXDtQQoCcldUot188NnZ0NRyOoBd6k_dGtEJhY.jar
    Sep 01, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-u_8zHQvQARStKv_k6AFy037R2wb_uY_PkNo9GEcf_9Y.jar
    Sep 01, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-fUWpxpMK3oQ34rzkvA2TzthSwDPJ2jfIPerTZ7UZ_10.jar
    Sep 01, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-vqOz7amy2HyW0mrk8bi4Pc6hn3bf0T2gxFXA5s6JeLQ.jar
    Sep 01, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-AUXwR2J9IKk0S6VA92Ta2knt0kJzJQBI_KZRB_56g40.jar
    Sep 01, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-4LUkbR_I56SYaeMwIoH0ixg5Hbe6XYxPQguj4pD-M-c.jar
    Sep 01, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-DJ0npGt0D6RE_wro75Q3PoErsL13D1XBaaSMnmw_65g.jar
    Sep 01, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-fzVbk9-JI6wreSM4bQ4BlzW3vj58L9Pqqsq3VnFjMF4.jar
    Sep 01, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-T1HiR9cF_zcHdcNzRwtGNOsV-gjLK2W87d2AV0piAQw.jar
    Sep 01, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-zyViwD7kTNACTpG7RK0WxhBk70UMcpqWwOnRYPuBwcQ.jar
    Sep 01, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-G9WuDCuubpVx7OKQmEJFL4sZ4vVtWS2nf_7hlxG58SY.jar
    Sep 01, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-hl4Y4kELYzHxf1DcFgaeivNPhU03nTp5FfEAoz4w1us.jar
    Sep 01, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 1 seconds
    Sep 01, 2020 12:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 01, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 01, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 01, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 01, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 01, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92060 bytes, hash 1a1735361cdeaa1c5fa5864d3c6e5bc6ec11dc78481627d4e50e8e7a38a6b7dd> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Ghc1NhzeqhxfpYZNPG5bxuwR3HhIFifU5Q6Oejimt90.pb
    Sep 01, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 01, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-01_05_45_21-10470534112471675755?project=apache-beam-testing
    Sep 01, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-01_05_45_21-10470534112471675755
    Sep 01, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-01_05_45_21-10470534112471675755
    Sep 01, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-01T12:45:21.451Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 01, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-01T12:45:29.641Z: Worker configuration: n1-standard-1 in us-central1-a.
    Sep 01, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-01T12:45:30.343Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 01, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-01T12:45:30.379Z: Expanding GroupByKey operations into optimizable parts.
    Sep 01, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-01T12:45:30.404Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 01, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-01T12:45:30.480Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 01, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-01T12:45:30.517Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 01, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-01T12:45:30.541Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 01, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-01T12:45:30.572Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 01, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-01T12:45:30.943Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 01, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-01T12:45:31.008Z: Starting 5 workers in us-central1-a...
    Sep 01, 2020 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-01T12:45:56.433Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 01, 2020 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-01T12:46:04.707Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
    Sep 01, 2020 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-01T12:46:04.734Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
    Sep 01, 2020 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-01T12:46:10.123Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 01, 2020 12:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-01T12:46:29.348Z: Workers have started successfully.
    Sep 01, 2020 12:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-01T12:46:29.382Z: Workers have started successfully.
    Sep 01, 2020 12:47:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-01T12:47:10.078Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 01, 2020 12:47:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-01T12:47:10.227Z: Cleaning up.
    Sep 01, 2020 12:47:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-01T12:47:10.305Z: Stopping worker pool...
    Sep 01, 2020 12:48:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-01T12:48:07.890Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 01, 2020 12:48:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-01T12:48:07.937Z: Worker pool stopped.
    Sep 01, 2020 12:48:16 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-01_05_45_21-10470534112471675755 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 070d1263-d0ff-41e4-80b7-885d692d4ca1 and timestamp: 2020-09-01T12:48:16.854000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    18.403

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 01, 2020 12:48:17 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.024 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.031 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 3 mins 10.068 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 59s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/o7iys6pivegei

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #941

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/941/display/redirect>

Changes:


------------------------------------------
[...truncated 294.29 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 01, 2020 6:46:08 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 01, 2020 6:46:08 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 01, 2020 6:46:08 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 01, 2020 6:46:08 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 01, 2020 6:46:08 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 01, 2020 6:46:08 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 01, 2020 6:46:08 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 01, 2020 6:46:09 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 01, 2020 6:46:09 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 01, 2020 6:46:09 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 01, 2020 6:46:09 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 01, 2020 6:46:09 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 01, 2020 6:46:09 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 01, 2020 6:46:09 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 01, 2020 6:46:09 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 01, 2020 6:46:10 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 01, 2020 6:46:13 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 01, 2020 6:46:13 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-u8q2ObEu9yXpm65SqBzxmqqQHwEE4iGSRGOp8qQDC4o.jar
    Sep 01, 2020 6:46:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-XB7XMBHErQQhBEtMjmMz1ehTi-YTk7_X-cALBwAo0Wo.jar
    Sep 01, 2020 6:46:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-jOcBbI3Dk7W6EK7fc7g7N4NPIKOxr2MI6-1TrNRHXQs.jar
    Sep 01, 2020 6:46:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-rpeL7-uxHh4X1iDYi03rMltQDneEbWnt13z0Lgcwzvo.jar
    Sep 01, 2020 6:46:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-MRXx6U30tN5UM6jHl80wqWrguBuR_Xg8w2L19ymu3Ew.jar
    Sep 01, 2020 6:46:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-BWH_chtbREMStKaX9FPGlagIIbePyCmbCfXv10WWP7M.jar
    Sep 01, 2020 6:46:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-Zjcho1E3BiJlLAXdBJHClpfS9yWHbmFA6_9z0KMJYRU.jar
    Sep 01, 2020 6:46:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-Mt1-vkL4Z-9A9zUrwu1nYEM1JOm5bsjnXvM-DbXqmgQ.jar
    Sep 01, 2020 6:46:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-RBB3nsrTVKpGsqPo1dlXL8r2xfzdnnWTjYwWxu2499w.jar
    Sep 01, 2020 6:46:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-pjWq2h0xhmAhhUxtxhPjSpUySCmKEc_R5RtpPqHqois.jar
    Sep 01, 2020 6:46:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-2ydmMUwqH2V7etjXWVmhATRnYIPUMK859569U-W9tSM.jar
    Sep 01, 2020 6:46:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-Thc15ZjKWMsPfcbTJLSasmO9LlIp_su4I_lEuGrmow4.jar
    Sep 01, 2020 6:46:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-B9HopbnogRo3ImJqZx3SIGLZWm9mzQ4156bGT9spg-g.jar
    Sep 01, 2020 6:46:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-LIZYgAyaocgJtQgKN-3CKjNDR63ZlOgkllXC7ph9M7g.jar
    Sep 01, 2020 6:46:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-u8q2ObEu9yXpm65SqBzxmqqQHwEE4iGSRGOp8qQDC4o.jar
    Sep 01, 2020 6:46:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-KQxbiEe4M-VYS6F5No_32ZYdp5sDz0aKBuRljmOMuqI.jar
    Sep 01, 2020 6:46:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-bpV4YkpDFk1bKSJHqQ-ZMtf3rfeHe12HZyBooxAhGFI.jar
    Sep 01, 2020 6:46:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-_smkOnWPPq5MvP3mn5PDeAU1VT570JSdbCgKNchiciE.jar
    Sep 01, 2020 6:46:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-desUHJS6IqknAELpbKq4cIcLjWgaAmrt9bchPHm0AvI.jar
    Sep 01, 2020 6:46:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-mHcTbju4pb0PKG4eXmco4xZpdmrLE6nsXkwdkV5kFio.jar
    Sep 01, 2020 6:46:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-QYy-PmCFZPDikhgyVT2Zk-CeL7uv4KjJiDafmpwruwc.jar
    Sep 01, 2020 6:46:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-XsXhMm_4G9ByMX5_iPrGm1doub-29WmbE0mOJb26IZU.jar
    Sep 01, 2020 6:46:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-QkMKZ0ONx9Hc2yYDFoNx6rgaYMO8maDGtKVMj-WrVz0.jar
    Sep 01, 2020 6:46:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-L31YqAkQUQ8-ObraplcsFN8jDinIAMqhbge2xVTEG90.jar
    Sep 01, 2020 6:46:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test6865256341034339193.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-ZwuggYlouifUKueSzhcgsDppueho_XwCW7KTVu1yIG8.jar
    Sep 01, 2020 6:46:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-z3pxOoVxFx-QkRB6FjikVFgDSLJgJUOvrY7R3gvxHLA.jar
    Sep 01, 2020 6:46:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-mQwgJwhQpN8tVUvlUFx24ttUryMksbP1FUJwHslPrWg.jar
    Sep 01, 2020 6:46:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-8NrU4CkO2LukJ848QiKNrW49RxqTunaNd0ouqLr_qvg.jar
    Sep 01, 2020 6:46:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-Boojf4QKX__MXwGPbVfm7EYqOLupYUkvuhBpO1lBWg4.jar
    Sep 01, 2020 6:46:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-NgmghPXmWR6DGjM93z5yXq18ufQSblV1Q190UnLqqVk.jar
    Sep 01, 2020 6:46:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-r5kMoQkg0TzL9SoNICIyU-SQ2S3nRQ_eso2mXXlsrrE.jar
    Sep 01, 2020 6:46:15 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 2 seconds
    Sep 01, 2020 6:46:15 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 01, 2020 6:46:15 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 01, 2020 6:46:15 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 01, 2020 6:46:15 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 01, 2020 6:46:15 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 01, 2020 6:46:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92060 bytes, hash faa4d029cdee90ec4b871b55264bd4174bc7075d342128a85f6e9f70ed4964ec> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline--qTQKc3ukOxLhxtVJkvUF0vHB100ISioX26fcO1JZOw.pb
    Sep 01, 2020 6:46:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 01, 2020 6:46:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-31_23_46_16-5292560091599181982?project=apache-beam-testing
    Sep 01, 2020 6:46:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-31_23_46_16-5292560091599181982
    Sep 01, 2020 6:46:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-31_23_46_16-5292560091599181982
    Sep 01, 2020 6:46:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-01T06:46:16.307Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 01, 2020 6:46:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-01T06:46:23.901Z: Worker configuration: n1-standard-1 in us-central1-a.
    Sep 01, 2020 6:46:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-01T06:46:24.688Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 01, 2020 6:46:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-01T06:46:24.739Z: Expanding GroupByKey operations into optimizable parts.
    Sep 01, 2020 6:46:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-01T06:46:24.775Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 01, 2020 6:46:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-01T06:46:24.882Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 01, 2020 6:46:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-01T06:46:24.932Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 01, 2020 6:46:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-01T06:46:24.973Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 01, 2020 6:46:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-01T06:46:25.009Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 01, 2020 6:46:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-01T06:46:25.661Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 01, 2020 6:46:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-01T06:46:25.734Z: Starting 5 workers in us-central1-a...
    Sep 01, 2020 6:46:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-01T06:46:44.991Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 01, 2020 6:46:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-01T06:46:56.199Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 01, 2020 6:47:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-01T06:47:17.389Z: Workers have started successfully.
    Sep 01, 2020 6:47:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-01T06:47:17.424Z: Workers have started successfully.
    Sep 01, 2020 6:47:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-01T06:47:53.942Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 01, 2020 6:47:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-01T06:47:54.113Z: Cleaning up.
    Sep 01, 2020 6:47:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-01T06:47:54.209Z: Stopping worker pool...
    Sep 01, 2020 6:48:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-01T06:48:48.179Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 01, 2020 6:48:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-01T06:48:48.220Z: Worker pool stopped.
    Sep 01, 2020 6:48:56 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-31_23_46_16-5292560091599181982 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 21a99150-85c1-4501-9b4c-82c67c08d27c and timestamp: 2020-09-01T06:48:56.431000000Z:
                     Metric:                    Value:
                   read_time                    17.337
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 01, 2020 6:48:56 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 5,5,main]) completed. Took 3 mins 2.54 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 36s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/pllhdte4a73x6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #940

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/940/display/redirect?page=changes>

Changes:

[srohde] Fix BCJ termination conditions.

[Boyuan Zhang] Only encoding element residual when it's not null.

[Kyle Weaver] [BEAM-10825] Fix null pointer for non-Jenkins build.

[noreply] Supporting delimiter for GcsUtil.listObjects (#12732)


------------------------------------------
[...truncated 297.52 KB...]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 01, 2020 12:47:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 01, 2020 12:47:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 01, 2020 12:47:35 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 01, 2020 12:47:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 01, 2020 12:47:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 01, 2020 12:47:35 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 01, 2020 12:47:36 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
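
The stack trace above is the usual Beam schema/coder pitfall: the Row-typed output of the RowMonitor ParDo has no schema attached, so no coder can be inferred for it. A minimal sketch of the two fixes the message itself suggests (PCollection.setRowSchema or an explicit RowCoder); the class name, schema fields and sample values below are illustrative placeholders, not the IT's actual code:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaExample {

      // Schema matching the columns projected by the query above (illustrative).
      private static final Schema SCHEMA =
          Schema.builder()
              .addStringField("author")
              .addStringField("type")
              .addStringField("title")
              .addInt64Field("score")
              .build();

      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        PCollection<Row> rows =
            p.apply("Names", Create.of("alice", "bob"))
                .apply("ToRow", ParDo.of(
                    new DoFn<String, Row>() {
                      @ProcessElement
                      public void processElement(@Element String name, OutputReceiver<Row> out) {
                        out.output(Row.withSchema(SCHEMA)
                            .addValues(name, "story", "some title", 3L)
                            .build());
                      }
                    }))
                // Without this call (or an explicit rows.setCoder(RowCoder.of(SCHEMA))),
                // pipeline construction fails with the IllegalStateException above:
                // "Cannot provide a coder for a Beam Row."
                .setRowSchema(SCHEMA);

        p.run().waitUntilFinish();
      }
    }
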

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 01, 2020 12:47:36 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 01, 2020 12:47:36 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 01, 2020 12:47:36 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 01, 2020 12:47:36 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 01, 2020 12:47:36 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 01, 2020 12:47:36 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 01, 2020 12:47:36 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 01, 2020 12:47:36 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
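
For reference, the projection (usedFields) and the filter pushed down here are handed to the BigQuery Storage Read API by the DIRECT_READ method. Outside of Beam SQL the same push-down can be expressed directly on BigQueryIO; a rough sketch follows, where the project, dataset/table and class name are placeholders and only the column list and the filter text are taken from the log above:

    import com.google.api.services.bigquery.model.TableRow;
    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
    import org.apache.beam.sdk.values.PCollection;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        PCollection<TableRow> rows =
            p.apply(
                BigQueryIO.readTableRows()
                    .from("my-project:my_dataset.hacker_news")  // placeholder table
                    .withMethod(Method.DIRECT_READ)
                    // Projection push-down: only the used fields are read.
                    .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                    // Filter push-down: evaluated server-side by the Storage Read API.
                    .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }
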
    Sep 01, 2020 12:47:37 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 01, 2020 12:47:39 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 01, 2020 12:47:39 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-MSB2xD0-1dcNph_qC4gWRoLPGTGQVpRv_02FARQJOng.jar
    Sep 01, 2020 12:47:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-iMVfFROKB8faJqarcJSjhtglAAvbRFnx2raYeN4Mlts.jar
    Sep 01, 2020 12:47:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-X-57636PYOByTIig1x141imcyvGjwu0SZQql6P-YLxE.jar
    Sep 01, 2020 12:47:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-MSB2xD0-1dcNph_qC4gWRoLPGTGQVpRv_02FARQJOng.jar
    Sep 01, 2020 12:47:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-W0pOmKE-qMbknhsAhRgPIpW3qiG5S620nWlRqFb2Nak.jar
    Sep 01, 2020 12:47:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-EnnYqzDfES8er1pC2rqKUhvUJeK8YOOmq5OO7RR_7NQ.jar
    Sep 01, 2020 12:47:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-y-8DSL3O8FNAdgdSC8JqAaye-mS70ddvaImd0XxnAe4.jar
    Sep 01, 2020 12:47:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-Hi01ofJQSDaW8qrsNqdeQvPSaCHnO04d3dIhR25IXpg.jar
    Sep 01, 2020 12:47:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-ijhgEqhIpiR7GzEUHpe0NN5eB4DmMH-A9mYIOMYOELs.jar
    Sep 01, 2020 12:47:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-mWCCmJGt-RLWFb0aSmxFxUn4NvKxbQicqnp5hIiQDbA.jar
    Sep 01, 2020 12:47:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tGNQzoq0T2rJB6seTxOSueyshNubz3MlvtIBpyoFRJc.jar
    Sep 01, 2020 12:47:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-HSSpvvsy5Df90eXwpyZ58xoyt1vaK8un_6SPRrzehNg.jar
    Sep 01, 2020 12:47:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test6814348078372259496.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-2ZrSJq19A_YvQe2_3h2iTjd-Bxfw5Szfl4EmMH6O1oU.jar
    Sep 01, 2020 12:47:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-FX8HtD27YqeHbS0-IHIymGSB39IzTbygDkxcXUWIIow.jar
    Sep 01, 2020 12:47:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-2eGNbSlv8xTXL62xSe89Gsk4jgeyDZFHUt79p7W8HVQ.jar
    Sep 01, 2020 12:47:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-P0GFuk-MNrwNPfR01R8TTDawWNBu1nG39LIVBECTglA.jar
    Sep 01, 2020 12:47:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-9d_FsUGFxv9fcwSpMMCIZ_lwkf0if1nxL29OAC3EvfA.jar
    Sep 01, 2020 12:47:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-vtzQBhKWsvei8zUSmTe14L4ypLM9wU0lg1FuLdiNpOA.jar
    Sep 01, 2020 12:47:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-3hYZ8bf23vP1ZdRG9ZhVJ39PZAtpFo3_w_tsCx-KSx4.jar
    Sep 01, 2020 12:47:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-H6kZbi94p0apN2zJPb0eJ2bWRsD_-Cfe0r6nrHS1GMI.jar
    Sep 01, 2020 12:47:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-qTVtTS_1yx0CpbVIyMlvgsE28Z4j55ZWwNSkJjdHSdU.jar
    Sep 01, 2020 12:47:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-Uf6R6aYkp9Fog-LA_VLSOwdS0o9Y22gV3lx7Qq6dyv4.jar
    Sep 01, 2020 12:47:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-UtFBa-eq6EEi4JJ1WGoZEyeeGIh3bqPE9qtGRWof9w8.jar
    Sep 01, 2020 12:47:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-2Dqq6HvdY35PJCvo9hnF3HGwVyavHUWpKBleZ_yRG8M.jar
    Sep 01, 2020 12:47:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-AKwRwbmaRt8TPU2kv_LxjT1Vq93cnleO80Xd1esTtSg.jar
    Sep 01, 2020 12:47:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-F4TB2ICBwAk7GBOhSaj7r0qGGbe_XE5zXJMTBHPojQw.jar
    Sep 01, 2020 12:47:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-_uYFink_9u5vySK1PYXJvMrrTDos522Cmy3whUzAFXw.jar
    Sep 01, 2020 12:47:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-TUDwcNWr64ZIOilw6eVBC4MAEblMu9qTF8Stu_yuHPM.jar
    Sep 01, 2020 12:47:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-dY7qIvG7Pl-24GEbiSJTgIIS8vCU4-x1Eb2KnAdgqL0.jar
    Sep 01, 2020 12:47:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-yvKifVXvwWQXxIIx1l3gfwzd7coXQkgNbDQz-GbfVZA.jar
    Sep 01, 2020 12:47:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-cnpPN7dyJqtDTI6Qb6Fo8hzH680BpnfkBHK93lKV1MM.jar
    Sep 01, 2020 12:47:40 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 1 seconds
    Sep 01, 2020 12:47:40 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 01, 2020 12:47:40 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 01, 2020 12:47:40 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 01, 2020 12:47:40 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 01, 2020 12:47:40 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 01, 2020 12:47:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92060 bytes, hash ac8ff284e352aca33ac30478355d0992268a1c304b549b482feac89a32ec1639> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-rI_yhONSrKM6wwR4NV0JkiaKHDBLVJtIL-rImjLsFjk.pb
    Sep 01, 2020 12:47:41 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 01, 2020 12:47:42 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-31_17_47_41-6953424049927721505?project=apache-beam-testing
    Sep 01, 2020 12:47:42 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-31_17_47_41-6953424049927721505
    Sep 01, 2020 12:47:42 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-31_17_47_41-6953424049927721505
    Sep 01, 2020 12:47:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-01T00:47:41.178Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 01, 2020 12:47:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-01T00:47:49.620Z: Worker configuration: n1-standard-1 in us-central1-a.
    Sep 01, 2020 12:47:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-01T00:47:50.219Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 01, 2020 12:47:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-01T00:47:50.410Z: Expanding GroupByKey operations into optimizable parts.
    Sep 01, 2020 12:47:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-01T00:47:50.448Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 01, 2020 12:47:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-01T00:47:50.556Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 01, 2020 12:47:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-01T00:47:50.585Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 01, 2020 12:47:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-01T00:47:50.620Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 01, 2020 12:47:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-01T00:47:50.666Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 01, 2020 12:47:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-01T00:47:51.309Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 01, 2020 12:47:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-01T00:47:51.381Z: Starting 5 workers in us-central1-a...
    Sep 01, 2020 12:48:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-01T00:48:06.121Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 01, 2020 12:48:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-01T00:48:23.253Z: Autoscaling: Raised the number of workers to 3 based on the rate of progress in the currently running stage(s).
    Sep 01, 2020 12:48:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-01T00:48:23.286Z: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
    Sep 01, 2020 12:48:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-01T00:48:28.657Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 01, 2020 12:48:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-01T00:48:48.299Z: Workers have started successfully.
    Sep 01, 2020 12:48:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-01T00:48:48.332Z: Workers have started successfully.
    Sep 01, 2020 12:49:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-01T00:49:22.596Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 01, 2020 12:49:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-01T00:49:22.792Z: Cleaning up.
    Sep 01, 2020 12:49:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-01T00:49:22.881Z: Stopping worker pool...
    Sep 01, 2020 12:50:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-01T00:50:12.831Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 01, 2020 12:50:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-01T00:50:12.897Z: Worker pool stopped.
    Sep 01, 2020 12:50:22 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-31_17_47_41-6953424049927721505 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): b426898c-09ef-486c-a26d-20ccac33e80a and timestamp: 2020-09-01T00:50:22.344000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    13.891

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 01, 2020 12:50:23 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.068 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.066 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 2,5,main]) completed. Took 2 mins 55.355 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 46s
106 actionable tasks: 73 executed, 33 from cache

Publishing build scan...
https://gradle.com/s/34okduxm2hzhg

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #939

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/939/display/redirect?page=changes>

Changes:

[tobiasz.kedzierski] [BEAM-10835] Improve Github Actions cancelling duplicated runs

[tobiasz.kedzierski] [BEAM-10837] Remove unused beam_PerformanceTests_Analysis Jenkins Job

[Ismaël Mejía] [BEAM-10673] Add public methods and validation to


------------------------------------------
[...truncated 294.16 KB...]
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 31, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 31, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 31, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 31, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 31, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 31, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 31, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 31, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 31, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 31, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 31, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 31, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 31, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 31, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 31, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 31, 2020 6:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 31, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 31, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-D8z7WKmlq2p4siBIl76tjn_9fzVv_1_z5b9EtGHYhLE.jar
    Aug 31, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-UU0Y3C381-X8lb3_94o3tqm-cW3R3Yw_RnDSegh4o94.jar
    Aug 31, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-cY43278TGfc9YaxMM7Chx9LmqPZFAbIUCv8THnKraGM.jar
    Aug 31, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-QKnd3JGRtiphS7GuTIW14Q96v3q6YgrGRcT9LkNYHYo.jar
    Aug 31, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-0ypL9842U1eW7Lc3xS2xwiCotW_6VQdNU0CWDvjTCZo.jar
    Aug 31, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-WovFw8XTQ0ajVI1meKNES6t5AeksM3C6GDzprRcuPrU.jar
    Aug 31, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-eqqlPC4UYPVKiZ3D4jypH0kQ3If5U_SHUYxZNYMcEE4.jar
    Aug 31, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-D8z7WKmlq2p4siBIl76tjn_9fzVv_1_z5b9EtGHYhLE.jar
    Aug 31, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-vBrQeEG-cmA3SnBydHTsNoTcqC5TghVaJj2WTJXGqjM.jar
    Aug 31, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-V0jIMZqq33zNNNOd67sTDwzq91SmjPZSLbqNG6y1W5I.jar
    Aug 31, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-ExpbxMvwW8NEVCiUd7RQC3HWAhyGeSbnP18vry-9kVA.jar
    Aug 31, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-4qV1ro6gkHUgN_oae9z7091xEgQrXJ7PKcKcumg3mY4.jar
    Aug 31, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-4JJLcN0STlpa4vOgZvAoMJQkQoBUPwm1CaoWQi-ZhT0.jar
    Aug 31, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-XVZZGkgVVZR6VEgNEDkZ_jFKMnDbb0P6OuS0gXDXsUw.jar
    Aug 31, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-4ryosenmeMB8V4-rGUzMCwR4HpPsiXewHwcDmffpUTE.jar
    Aug 31, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-GBL_00zOnDfs442sYG1Zzaj-FvXHWZ_6iyH_X5CoivQ.jar
    Aug 31, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-r8S8IqvhUDNZsGiEzlvbs2e79V8Lx38POp7xKfj9BQI.jar
    Aug 31, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-zFy5pScqwI-7eCAFzwgnSi1L5ej6P1IhaznliZMpciQ.jar
    Aug 31, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-MFlV-mMjBdY4GWAp9wQ9NwKw1-0AYepz00ZY2-LtFXc.jar
    Aug 31, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-fYT-V1pTjo_lhoPQY-09HzLdmC8ifPA3U5CC2RTNaRs.jar
    Aug 31, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-dhFjiyGwsvxDfQ7WG7ALkF89JMXrPYjBeFk23Ejmy8Y.jar
    Aug 31, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2363219517964808656.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-IHFNmj4UqXrmefkzQiexTliQ3qoFigUfdsWrIiNF6kU.jar
    Aug 31, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-tBgXtlrt66DOcZ41TLhlB0E7FBExR3H7sspKhOa4UYk.jar
    Aug 31, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-XVzeGkj0cJsF-ETqwy38ZFAjMB9FTICdO7Owqs0olSw.jar
    Aug 31, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-_6QhP_lZvcutUdRO-gm0NcH1uOYJ8ELTe5j0ftHNEDA.jar
    Aug 31, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-Sjgro3H15mKefvBDO85LRkKohMB-UKA1pwdpwLxqujM.jar
    Aug 31, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-25jpj-9GODzFwKjqWKtamEqkU6jSt6PtHCjxmNjWl2o.jar
    Aug 31, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-CtBY2l0E1NMhZln4RQJ1LwzTUmGPFB8MujMrIvtV5Pg.jar
    Aug 31, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-OaLmdCDKHplgPV4aoyfUUQOQngZZqsvHBxQN3HUPkX4.jar
    Aug 31, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-H0zgdthqylK4gy27032pgfQ_GGwJfrjkD_0B1Dj7Bxw.jar
    Aug 31, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-zr3MaWyYAfJLc13cFRhSXvTtut6CrsXZ1vseSySkFpE.jar
    Aug 31, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 1 seconds
    Aug 31, 2020 6:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 31, 2020 6:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 31, 2020 6:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 31, 2020 6:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 31, 2020 6:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 31, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92060 bytes, hash 1667082461f22da653cb1e395ab8e606b64052ef3b65d06542a5b98e7ac4a5c7> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-FmcIJGHyLaZTyx45WrjmBrZAUu87ZdBlQqW5jnrEpcc.pb
    Aug 31, 2020 6:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Aug 31, 2020 6:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-31_11_45_21-5227514071141882943?project=apache-beam-testing
    Aug 31, 2020 6:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-31_11_45_21-5227514071141882943
    Aug 31, 2020 6:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-31_11_45_21-5227514071141882943
    Aug 31, 2020 6:45:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-31T18:45:21.706Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 31, 2020 6:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-31T18:45:28.899Z: Worker configuration: n1-standard-1 in us-central1-c.
    Aug 31, 2020 6:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-31T18:45:29.696Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 31, 2020 6:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-31T18:45:29.724Z: Expanding GroupByKey operations into optimizable parts.
    Aug 31, 2020 6:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-31T18:45:29.753Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 31, 2020 6:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-31T18:45:29.823Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 31, 2020 6:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-31T18:45:29.858Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 31, 2020 6:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-31T18:45:29.883Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 31, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-31T18:45:29.919Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 31, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-31T18:45:30.236Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 31, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-31T18:45:30.329Z: Starting 5 workers in us-central1-c...
    Aug 31, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-31T18:45:43.162Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 31, 2020 6:45:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-31T18:45:58.807Z: Autoscaling: Raised the number of workers to 3 based on the rate of progress in the currently running stage(s).
    Aug 31, 2020 6:45:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-31T18:45:58.838Z: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
    Aug 31, 2020 6:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-31T18:46:04.213Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 31, 2020 6:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-31T18:46:23.657Z: Workers have started successfully.
    Aug 31, 2020 6:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-31T18:46:23.691Z: Workers have started successfully.
    Aug 31, 2020 6:47:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-31T18:47:09.583Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 31, 2020 6:47:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-31T18:47:09.734Z: Cleaning up.
    Aug 31, 2020 6:47:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-31T18:47:09.811Z: Stopping worker pool...
    Aug 31, 2020 6:47:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-31T18:47:55.577Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 31, 2020 6:47:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-31T18:47:55.608Z: Worker pool stopped.
    Aug 31, 2020 6:48:02 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-31_11_45_21-5227514071141882943 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 04265d33-c00d-4d06-8619-49f68f60dbc0 and timestamp: 2020-08-31T18:48:02.842000000Z:
                     Metric:                    Value:
                   read_time                    21.647
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 31, 2020 6:48:03 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.033 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 2 mins 55.53 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 46s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/sbhxmykm3e354

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #938

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/938/display/redirect?page=changes>

Changes:

[Omkar_Deshpande] [BEAM-10829] pass kafka header to producer.send

[Maximilian Michels] [BEAM-10602] Fix display of latency and checkpoint duration metrics


------------------------------------------
[...truncated 294.10 KB...]
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 31, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 31, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 31, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 31, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 31, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 31, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 31, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
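
The failure above is exactly what the exception message describes: the RowMonitor ParDo emits Beam Rows, but its output has no schema attached, so no Coder can be inferred. A minimal sketch of the remedy the message suggests is shown below; the field names mirror the columns projected by the SQL above, while the INT64 score type, the method name, and the DoFn parameter are assumptions rather than the IT's actual code.

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    // Attach a row schema to the ParDo output so a RowCoder can be inferred
    // (alternatively, set an explicit coder with setCoder()).
    static PCollection<Row> monitorRows(PCollection<Row> rows, DoFn<Row, Row> rowMonitorFn) {
      Schema schema =
          Schema.builder()
              .addStringField("author")   // columns projected by the query above
              .addStringField("type")
              .addStringField("title")
              .addInt64Field("score")     // INT64 is an assumption about the source type
              .build();
      return rows
          .apply("RowMonitor", ParDo.of(rowMonitorFn))
          .setRowSchema(schema);
    }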

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 31, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 31, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 31, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 31, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 31, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 31, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 31, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 31, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
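
For context, the push-down logged above (the usedFields list plus the supported filter) is what a direct Storage API read does when only some columns and a row restriction are needed. A hand-written BigQueryIO equivalent might look like the sketch below; the table reference is a placeholder (the IT's actual table is not printed in this log), and the step name, selected fields, and row restriction are copied from the plan and filter lines above.

    import java.util.Arrays;
    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;
    import org.apache.beam.sdk.values.PCollection;

    // DIRECT_READ uses the BigQuery Storage API, so the projection and the
    // filter are applied on the server side instead of in the pipeline.
    static PCollection<TableRow> readWithPushDown(Pipeline pipeline) {
      return pipeline.apply(
          "Read Input BQ Rows with push-down",
          BigQueryIO.readTableRows()
              .from("my-project:my_dataset.HACKER_NEWS")  // placeholder table reference
              .withMethod(TypedRead.Method.DIRECT_READ)
              .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
              .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));
    }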
    Aug 31, 2020 12:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 31, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 31, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-N87oTfvyYIPrMe9av_TZ6Zz2E6-fc_xaOu-acsXshUA.jar
    Aug 31, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-ZF2sPRz2GLm-FjrnRzamRoMuBp_lTYLHzA73qiE_-WA.jar
    Aug 31, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-sMpVcQbMyelGrPXFZHBhifC15tw7QXAbZs2iAFG4dHc.jar
    Aug 31, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-uc3DOHlpTs4iuoN5w9iJmbHGuDlQQLALSBgRQQZWdp8.jar
    Aug 31, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-N87oTfvyYIPrMe9av_TZ6Zz2E6-fc_xaOu-acsXshUA.jar
    Aug 31, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-GwDji6-AgII4RCStuPYwRiCjtGWI9liAjKHrWcEw7eA.jar
    Aug 31, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-HziCDM77thG5SivLgjIk86zmyfdUjpBjLvTiO6uwATA.jar
    Aug 31, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-46Vp0MvqYdmtCIC1Drv_PKq1HdurzrQTNRwy38qOUgE.jar
    Aug 31, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-2DIXrNB-XtJtMsWL8PIeXktV4i37lBLzppWGDyAcAik.jar
    Aug 31, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-KTNLRzbn-XYEJM8brUqAlIVuNlXXxYEcLtn2hhjYmFg.jar
    Aug 31, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-LIA92P1UCCN5JksBNGjOkVPXbC-THr-QLaKXUQ7zwhI.jar
    Aug 31, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-ZsVOYIuEFwCLkWIlAa0c1RxVt2VYR7S041moTJxSzII.jar
    Aug 31, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-hyG8XBkTNdeDnV-f5uyQj-ebFmJmFBThdHxXIhPfNlc.jar
    Aug 31, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-Thmpo7Jy9PbEYG1UgwsZq0jgQ98PKC08koK7LxiMjHc.jar
    Aug 31, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-96a1opoAKAFeY9RZgM_ij3ZBpZppgT9PlCD_6q_h884.jar
    Aug 31, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-NqZ2WK2u6OqW-xXQgm55dxBySmDh4_PaaKoRlFEEmwA.jar
    Aug 31, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-n_0QOpUAwBkhKIh9U3ZIBp9X3XMjULHDQ0fmFN9ou08.jar
    Aug 31, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-Ex0itfNhsIAl49GDm71VY12FX4oqXfeO9YTMMbRTNlU.jar
    Aug 31, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-Xavc8z_aD6juaJimEwWRuP7oHd87UVWrkb0bAZLAZsU.jar
    Aug 31, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4796336939381203944.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-NAUNw4G54AqRa4fvOKi5fdKBCAOJjxNGjDMePmg_xTY.jar
    Aug 31, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-58nKpJX388uvPTDV6UvxaD-uoUxhKI_XTGcUlkhXFGA.jar
    Aug 31, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-rhKOHrhxd-09AxHuNM54qt_ZgkFkdxuSEBcKfB93fCg.jar
    Aug 31, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-XZsN9id46Vc81CNhElGW-RsyCKmzBPVA8vttJiDVRbA.jar
    Aug 31, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-KTr4UuFuoRq1Vlf4yPhxTtt5jNNySRSQHXRpHqbMV2A.jar
    Aug 31, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-OqOz8Id3gSZjVNmDAP4TSl7wtYubhFA_C2htQWAqcuw.jar
    Aug 31, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-In_xn8-9SG5dxWFuwQ7vtPLlcrH7b-kuuE8xVAvPXkk.jar
    Aug 31, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-5RnDm-kiM3FdfIMCIOBEg0r76BhTfzHzJ9iSXW0TAew.jar
    Aug 31, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-GjhAWNVbJ9vErveIbye4vlQP3kL0UolP9eCt0mdCZs8.jar
    Aug 31, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-qSYSph5I4C0niXYMSJvaWxLkvyuFfGk3O3XNhuxJABo.jar
    Aug 31, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-g-L7US5k-jFj6LX21uxXk0q2ABMu-Wrrjhxwzeg-bN0.jar
    Aug 31, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT--OAy9cG5UhJEuPpnhihE2dFsm86BuQCPMohoLBV4CxM.jar
    Aug 31, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 1 seconds
    Aug 31, 2020 12:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 31, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 31, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 31, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 31, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 31, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92060 bytes, hash 41e04dae43daa46728dec168b55756d904300bf138c90dd231ec8b955a4b21de> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-QeBNrkPapGco3sFotVdW2QQwC_E4yQ3SMeyLlVpLId4.pb
    Aug 31, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Aug 31, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-31_05_45_21-5549367559512709893?project=apache-beam-testing
    Aug 31, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-31_05_45_21-5549367559512709893
    Aug 31, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-31_05_45_21-5549367559512709893
    Aug 31, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-31T12:45:21.408Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 31, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-31T12:45:29.678Z: Worker configuration: n1-standard-1 in us-central1-a.
    Aug 31, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-31T12:45:30.954Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 31, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-31T12:45:30.998Z: Expanding GroupByKey operations into optimizable parts.
    Aug 31, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-31T12:45:31.024Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 31, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-31T12:45:31.110Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 31, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-31T12:45:31.140Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 31, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-31T12:45:31.175Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 31, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-31T12:45:31.209Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 31, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-31T12:45:31.547Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 31, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-31T12:45:31.649Z: Starting 5 workers in us-central1-a...
    Aug 31, 2020 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-31T12:45:38.751Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 31, 2020 12:45:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-31T12:45:57.672Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Aug 31, 2020 12:45:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-31T12:45:57.706Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Aug 31, 2020 12:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-31T12:46:03.410Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 31, 2020 12:46:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-31T12:46:25.908Z: Workers have started successfully.
    Aug 31, 2020 12:46:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-31T12:46:25.937Z: Workers have started successfully.
    Aug 31, 2020 12:47:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-31T12:47:02.939Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 31, 2020 12:47:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-31T12:47:03.098Z: Cleaning up.
    Aug 31, 2020 12:47:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-31T12:47:03.192Z: Stopping worker pool...
    Aug 31, 2020 12:47:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-31T12:47:58.171Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 31, 2020 12:47:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-31T12:47:58.216Z: Worker pool stopped.
    Aug 31, 2020 12:48:06 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-31_05_45_21-5549367559512709893 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 331972ae-1826-4105-b3b1-970d6dfd0497 and timestamp: 2020-08-31T12:48:06.827000000Z:
                     Metric:                    Value:
                   read_time                    19.666
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 31, 2020 12:48:07 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.022 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 3 mins 0.424 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 49s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/m4mfchwxyjxic

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #937

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/937/display/redirect>

Changes:


------------------------------------------
[...truncated 294.46 KB...]
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 31, 2020 6:45:18 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 31, 2020 6:45:18 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 31, 2020 6:45:18 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 31, 2020 6:45:18 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 31, 2020 6:45:18 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 31, 2020 6:45:18 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 31, 2020 6:45:19 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 31, 2020 6:45:19 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 31, 2020 6:45:19 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 31, 2020 6:45:19 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 31, 2020 6:45:19 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 31, 2020 6:45:19 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 31, 2020 6:45:19 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 31, 2020 6:45:19 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 31, 2020 6:45:19 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 31, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 31, 2020 6:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 31, 2020 6:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-x_04SGfx9Qb9sttHPEPIWj0OUeI-gx8nIFHdL-fBbzo.jar
    Aug 31, 2020 6:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-nPDKIPJyCAi2SbffX5p9s1aaq8_8Ewf8VE3X_PZhv7g.jar
    Aug 31, 2020 6:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4806103170144890601.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-OjG0i46eTlLCZQUQcluwyfx2ZSgiuKHKkjiRae8zo84.jar
    Aug 31, 2020 6:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-FQ6tJuYJ1qWZ-T16WQ9XZD4GRG3iO1m9IH0Fx5-Lug8.jar
    Aug 31, 2020 6:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-ZRDdiB2bWK2k0LTMh5SD5F_qFJ092uE3sZL13jNJFC0.jar
    Aug 31, 2020 6:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT--q39zYd2IeDZdS9dV5AbMvk40auJgl2-mPA1CSe0224.jar
    Aug 31, 2020 6:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-Suku4cJByWIC2azWuxVaB0u6wvVj3L4Ty4UT2u6G664.jar
    Aug 31, 2020 6:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-qyFoMBuAE0Tz0ozYzDwctYn6PXbRVge5wb6fm1aCEDE.jar
    Aug 31, 2020 6:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-cynRhlL3S8jPnfYgH9tzidBHWhm6nxMDkWixLudI3us.jar
    Aug 31, 2020 6:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-x_04SGfx9Qb9sttHPEPIWj0OUeI-gx8nIFHdL-fBbzo.jar
    Aug 31, 2020 6:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-RJ1mQ5bN0tpCG9Zf7Pm3Fmv20vxWyOmvNTVu9kpIEmk.jar
    Aug 31, 2020 6:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-lo363hqY4q-aDMtxCErUa83LtQ03LPELdvc0bEpOJ1o.jar
    Aug 31, 2020 6:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-ftGsScuIe8feYR3BUxqiNFQ8QOIEQxl0uS7Ulg6nN_E.jar
    Aug 31, 2020 6:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-eH9_QHcR4rR8ycxsTH2x285QHDOvFNpd3u7JvgIhvN4.jar
    Aug 31, 2020 6:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-qee1rQ7n568tpDbLFe02BlKoiskRa2Ojo7NkvEPfDwg.jar
    Aug 31, 2020 6:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-zoTc7TEOBhNFevcnJJfFYIiyMx7Cq-8sl8tL7hiW9x8.jar
    Aug 31, 2020 6:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-EQlTrkIvcOyz13vhuhVQewySJhuvZjKFz6ym6i8qT_U.jar
    Aug 31, 2020 6:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-YlA4Wua_pwQjJnThSwLfY7sjTokme7FeCzC_hYzC8dw.jar
    Aug 31, 2020 6:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-iJFrqHL3wqGALfhWLcl7q_08jVc5XTy1qLpK_BcKtVo.jar
    Aug 31, 2020 6:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-yLRG-0ns2DWVA6FbKKJmdAxCxUCjKvR-7Un9qUhufNU.jar
    Aug 31, 2020 6:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-29x8vRN4MSX7MvCPoqiqsjvKwnVqMH-q840Ee584hgM.jar
    Aug 31, 2020 6:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-Umzx2qCAOTB3PWpTePGMoO5Fm7O_8neGxSDBsyykpkg.jar
    Aug 31, 2020 6:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-3TtR_jS0a1LQ8FtZ27_kM945LOv02zO919xr-fDkmcc.jar
    Aug 31, 2020 6:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-4787NT_m26gIc1F0-v0MoOqTQ1_C___HfF99RxkWKeM.jar
    Aug 31, 2020 6:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-YrOsexUfSAOFyL8nIA7eNz-yYIl7nSCuQS0pzJqw66g.jar
    Aug 31, 2020 6:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-MEvY7uHSnIfOpJ9Xmisk6JYsdFYbGlix42rioOzG8WQ.jar
    Aug 31, 2020 6:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-AUpAO_P5yuek0BmC2tbbhuqWsL8wCUzLgnz20wcy698.jar
    Aug 31, 2020 6:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-cbKQT8hqpmkLAMlR-_velCDKcg_xITV3UCKFoRwfQF4.jar
    Aug 31, 2020 6:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-wA779X_Klk1H9xIkVU0ndiY4Emo3-oELAtNT822lvF4.jar
    Aug 31, 2020 6:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-65yCbpOqb9zmR3FPiQx8gmZT0hnRPKPUfwHYkvm8h_0.jar
    Aug 31, 2020 6:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-E2eodiBqOwbmx1L8hwfzJdoIpaNosOHE4eBVBWC3eY4.jar
    Aug 31, 2020 6:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 1 seconds
    Aug 31, 2020 6:45:24 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 31, 2020 6:45:24 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 31, 2020 6:45:24 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 31, 2020 6:45:24 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 31, 2020 6:45:24 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 31, 2020 6:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92060 bytes, hash 888b409831bfe0253f3ddff43558b27b8ad916b08e885887e46db06f53fdca16> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-iItAmDG_4CU_Pd_0NViye4rZFrCOiFiH5G2wb1P9yhY.pb
    Aug 31, 2020 6:45:24 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Aug 31, 2020 6:45:25 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-30_23_45_24-15343695916053471675?project=apache-beam-testing
    Aug 31, 2020 6:45:25 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-30_23_45_24-15343695916053471675
    Aug 31, 2020 6:45:25 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-30_23_45_24-15343695916053471675
    Aug 31, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-31T06:45:24.735Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 31, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-31T06:45:33.404Z: Worker configuration: n1-standard-1 in us-central1-a.
    Aug 31, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-31T06:45:34.458Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 31, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-31T06:45:34.500Z: Expanding GroupByKey operations into optimizable parts.
    Aug 31, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-31T06:45:34.528Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 31, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-31T06:45:34.595Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 31, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-31T06:45:34.624Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 31, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-31T06:45:34.657Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 31, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-31T06:45:34.691Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 31, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-31T06:45:35.094Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 31, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-31T06:45:35.176Z: Starting 5 workers in us-central1-a...
    Aug 31, 2020 6:45:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-31T06:45:56.003Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 31, 2020 6:46:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-31T06:46:06.093Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Aug 31, 2020 6:46:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-31T06:46:06.130Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Aug 31, 2020 6:46:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-31T06:46:11.620Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 31, 2020 6:46:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-31T06:46:34.654Z: Workers have started successfully.
    Aug 31, 2020 6:46:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-31T06:46:34.691Z: Workers have started successfully.
    Aug 31, 2020 6:47:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-31T06:47:08.640Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 31, 2020 6:47:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-31T06:47:08.781Z: Cleaning up.
    Aug 31, 2020 6:47:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-31T06:47:08.880Z: Stopping worker pool...
    Aug 31, 2020 6:48:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-31T06:47:59.482Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 31, 2020 6:48:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-31T06:47:59.525Z: Worker pool stopped.
    Aug 31, 2020 6:48:07 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-30_23_45_24-15343695916053471675 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): f65642b6-56e2-44ba-9f8e-e7b25dd45868 and timestamp: 2020-08-31T06:48:07.414000000Z:
                     Metric:                    Value:
                   read_time                    13.137
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 31, 2020 6:48:07 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 2 mins 57.804 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 50s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/kebpci6g3zppm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #936

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/936/display/redirect>

Changes:


------------------------------------------
[...truncated 293.65 KB...]
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 31, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 31, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 31, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 31, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 31, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 31, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 31, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
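
For reference, the query the planner logs above can be run outside the IT with SqlTransform over an in-memory PCollection. This is a hypothetical sketch, not the test's own setup: the data and schema are placeholders, and it assumes the standard PCOLLECTION table name for a single-input SqlTransform. It also shows the row schema being attached explicitly via a RowCoder, which is the information the IllegalStateException above reports as missing.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class HackerNewsQuerySketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Placeholder schema standing in for the HACKER_NEWS columns used by the query.
        Schema schema =
            Schema.builder()
                .addStringField("by")
                .addStringField("type")
                .addStringField("title")
                .addInt32Field("score")
                .build();

        // The explicit RowCoder is what gives this PCollection a schema.
        PCollection<Row> input =
            p.apply(
                Create.of(
                        Row.withSchema(schema).addValues("alice", "story", "A story", 5).build(),
                        Row.withSchema(schema).addValues("bob", "comment", "A comment", 1).build())
                    .withCoder(RowCoder.of(schema)));

        // Same shape as the query in the log, run against the single-input PCOLLECTION table.
        PCollection<Row> result =
            input.apply(
                SqlTransform.query(
                    "SELECT `by` AS author, `type`, `title`, `score` FROM PCOLLECTION "
                        + "WHERE (`type` = 'story' OR `type` = 'job') AND `score` > 2"));

        p.run().waitUntilFinish();
      }
    }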

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 31, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 31, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 31, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 31, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 31, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 31, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 31, 2020 12:45:18 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 31, 2020 12:45:18 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 31, 2020 12:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 31, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 31, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-F1TO_mIbG4TZO39sUoRBqMNUyzE7oKmTivhmcaEZtgQ.jar
    Aug 31, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-g5ZAc4HtXJoYgW4bP74oAVG4Y-vpgrzIKIIu-L7MGYE.jar
    Aug 31, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-k83laDDJ240wzSXP75xd_DsoQLo_guTxfT0epot23uQ.jar
    Aug 31, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-cbMYusU-cK1KNYK3vFK01PaZ3xk_GU61FvWUakNReps.jar
    Aug 31, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-isdSncL9KEFZI4PlqngDh-7c1CRRM-XOvzantX2g2Z8.jar
    Aug 31, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-CDdHR-zOgr60djWBbMZFZ85pGVq1CKpbIMdyYhRypOc.jar
    Aug 31, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-WUQmz4bj5oHjePfOAqqZSCU39Bp8Gj6ClJLYGjR1XzM.jar
    Aug 31, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-cUN8HuR8XnNjWdv1cA6QQhXsyAspRsvnzvUB2bp4pf0.jar
    Aug 31, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5868614089108410571.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-fOrKdNq8D6GrAgXRcC4GVBYgPWTD8GU0iK9k9U-mphU.jar
    Aug 31, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-jQTY-wmynnZ-vyzV4GbJsyCokO4M3phDVAZ29AfhFfE.jar
    Aug 31, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-KB4INR_9MPoVi9-2CsgMO_cvh7eMECa_1mtROeMc3_g.jar
    Aug 31, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-F1TO_mIbG4TZO39sUoRBqMNUyzE7oKmTivhmcaEZtgQ.jar
    Aug 31, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-n_W1hDmeJuW7goVpUMpl088Gf64H0cnDOOgzkf_mpJE.jar
    Aug 31, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-8DShyqnCelOdc5w9PO3nAkaa19rAaY4biJfUBmFfa0w.jar
    Aug 31, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-sQyoI59rDKpEgaY6KVLCX9O253vPHfUkPeH0mjG7A88.jar
    Aug 31, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-9TbcIe1z_ozg755ZHJ6IBdKWwuOTe_v5LEgPris2848.jar
    Aug 31, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-F73TgC-1qZ60NkgAHZz9AI-_zowYmIH8P_mYib9dYls.jar
    Aug 31, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-VkWwJZ7cAtpfyaR4cg2wbJU14DAwynhWeE2fzbgtotY.jar
    Aug 31, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-lUWxawQFOzgXXnVjBR4zF9wAWo7k9wGapHPIX7ZDE2k.jar
    Aug 31, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-9QESA-wnw3FH8g90uQIHJEeEFu-LE111hryEqRCgaKM.jar
    Aug 31, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-aD-GXgJCTlw3wf1UFcLgd6_EZ0RfsgHQLWJtLur9Xm4.jar
    Aug 31, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-d4lE-KJ7gDa0SLMZwVZz5cy0KOk3wL7duGExitP6nzU.jar
    Aug 31, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-O5cVrx61tkC-X7LCszFrNgB97CskeoPCS-JQsq93ZdM.jar
    Aug 31, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-XoeCKYr74GlMkURTCDkDklAbMWV4FJ7YAc6eGHAl5wc.jar
    Aug 31, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-FHChufnZhQYEKbTwinK76k0wGiTFbh7hKJxC-vKD4PY.jar
    Aug 31, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-_bgT1Pq7UlRP8AyB3bKAuUem8ovD-SokDeUQJEQbqTI.jar
    Aug 31, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-XrkfTZcmG0drrTuNyxmYLFp9sWR0i9p8Lx1E2qbLsq4.jar
    Aug 31, 2020 12:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-rkXw5DKX2cAeFZ4sl48HFKhu-dfGCwla4jREHBmUNrg.jar
    Aug 31, 2020 12:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-qHqwek7Jus1xTluylzwtHqlq1yrCN2CE0RK2WRt7fDI.jar
    Aug 31, 2020 12:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-Ruxu3fLXtOXK3Z7EkwqQkp-tyG6NTALbzXR5r0tCSDg.jar
    Aug 31, 2020 12:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-u5LjjmYNmsuUoq5rGGP-NQGvAeFdEkEwQa8XztYIFLI.jar
    Aug 31, 2020 12:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/com.alibaba/fastjson/1.2.68/9e3d29f05bcfab1c15a1357ebf2dd513c1d42f49/fastjson-1.2.68.jar to gs://temp-storage-for-perf-tests/loadtests/staging/fastjson-1.2.68-cGrbCezeeBQfDPJGWh6b307ug_n5g8_BYqWhckhy_rs.jar
    Aug 31, 2020 12:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 179 files cached, 31 files newly uploaded in 0 seconds
    Aug 31, 2020 12:45:21 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 31, 2020 12:45:21 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 31, 2020 12:45:21 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 31, 2020 12:45:21 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 31, 2020 12:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 31, 2020 12:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92060 bytes, hash 682fdd06945fbefea8347bc86b8010f362c9aa4ee94b35f8f48a442a3ec8dee4> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-aC_dBpRfvv6oNHvIa4AQ82LJqk7pSzX49IpEKj7I3uQ.pb
    Aug 31, 2020 12:45:22 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Aug 31, 2020 12:45:23 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-30_17_45_22-14038946904192971738?project=apache-beam-testing
    Aug 31, 2020 12:45:23 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-30_17_45_22-14038946904192971738
    Aug 31, 2020 12:45:23 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-30_17_45_22-14038946904192971738
    Aug 31, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-31T00:45:22.380Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 31, 2020 12:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-31T00:45:31.038Z: Worker configuration: n1-standard-1 in us-central1-a.
    Aug 31, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-31T00:45:32.485Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 31, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-31T00:45:32.529Z: Expanding GroupByKey operations into optimizable parts.
    Aug 31, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-31T00:45:32.565Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 31, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-31T00:45:32.704Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 31, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-31T00:45:32.734Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 31, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-31T00:45:32.772Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 31, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-31T00:45:32.807Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 31, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-31T00:45:33.194Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 31, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-31T00:45:33.294Z: Starting 5 workers in us-central1-a...
    Aug 31, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-31T00:45:39.540Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 31, 2020 12:46:07 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-31T00:46:05.509Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 31, 2020 12:46:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-31T00:46:33.831Z: Workers have started successfully.
    Aug 31, 2020 12:46:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-31T00:46:33.852Z: Workers have started successfully.
    Aug 31, 2020 12:47:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-31T00:47:07.397Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 31, 2020 12:47:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-31T00:47:07.532Z: Cleaning up.
    Aug 31, 2020 12:47:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-31T00:47:07.609Z: Stopping worker pool...
    Aug 31, 2020 12:48:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-31T00:47:59.780Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 31, 2020 12:48:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-31T00:47:59.825Z: Worker pool stopped.
    Aug 31, 2020 12:48:08 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-30_17_45_22-14038946904192971738 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 972a3377-3573-445d-9cdb-76e6b4033414 and timestamp: 2020-08-31T00:48:08.781000000Z:
                     Metric:                    Value:
                   read_time                    13.038
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 31, 2020 12:48:09 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 3 mins 1.663 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 52s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/s4nbbeynqvy5y

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #935

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/935/display/redirect?page=changes>

Changes:

[ningk] [BEAM-10775] Added a typescript precommit job


------------------------------------------
[...truncated 293.27 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 30, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 30, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 30, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 30, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 30, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 30, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 30, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])
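
    For reference, the query and plan logged above can be reproduced outside the integration test with a small standalone Beam SQL pipeline. The sketch below is illustrative only: it runs the same projection and filter over an in-memory PCollection<Row> rather than the BigQuery-backed `beam`.`HACKER_NEWS` table, and the schema and sample rows are assumptions, not the test's data.

        import org.apache.beam.sdk.Pipeline;
        import org.apache.beam.sdk.coders.RowCoder;
        import org.apache.beam.sdk.extensions.sql.SqlTransform;
        import org.apache.beam.sdk.schemas.Schema;
        import org.apache.beam.sdk.transforms.Create;
        import org.apache.beam.sdk.values.PCollection;
        import org.apache.beam.sdk.values.Row;

        public class HackerNewsSqlSketch {
          public static void main(String[] args) {
            // Illustrative schema covering only the fields used by the query.
            Schema schema =
                Schema.builder()
                    .addStringField("by")
                    .addStringField("type")
                    .addStringField("title")
                    .addInt64Field("score")
                    .build();

            Pipeline p = Pipeline.create();

            // Assumed sample rows; field order matches the schema (by, type, title, score).
            PCollection<Row> input =
                p.apply(
                    Create.of(
                            Row.withSchema(schema).addValues("alice", "story", "Hello", 10L).build(),
                            Row.withSchema(schema).addValues("bob", "comment", "Re: Hello", 1L).build())
                        .withCoder(RowCoder.of(schema)));

            // Same projection and filter as the SQL in the log above; an unregistered
            // input is addressed as PCOLLECTION in Beam SQL.
            PCollection<Row> result =
                input.apply(
                    SqlTransform.query(
                        "SELECT `by` AS author, type, title, score "
                            + "FROM PCOLLECTION "
                            + "WHERE (type = 'story' OR type = 'job') AND score > 2"));

            // `result` would normally feed a sink or assertion; omitted in this sketch.
            p.run().waitUntilFinish();
          }
        }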


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
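
    The IllegalStateException above is the coder check in PCollection.finishSpecifying, and its message already names the two available fixes. A minimal sketch of both, assuming a Schema matching the produced rows is at hand; these helpers are illustrative and not part of the test code:

        import org.apache.beam.sdk.coders.RowCoder;
        import org.apache.beam.sdk.schemas.Schema;
        import org.apache.beam.sdk.values.PCollection;
        import org.apache.beam.sdk.values.Row;

        class RowCoderFixSketch {
          // Preferred: declare the row schema; coder inference and schema-aware
          // transforms then both work for the collection.
          static PCollection<Row> attachSchema(PCollection<Row> rows, Schema schema) {
            return rows.setRowSchema(schema);
          }

          // Equivalent for coder inference only: set a RowCoder for the same schema.
          static PCollection<Row> attachCoder(PCollection<Row> rows, Schema schema) {
            return rows.setCoder(RowCoder.of(schema));
          }
        }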

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 30, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 30, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 30, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 30, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 30, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 30, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 30, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 30, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 30, 2020 6:45:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 30, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 30, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-KQ6QuLfTe4fjy1VCmBz7CjavZkXS51Sw28FfJyg9vyw.jar
    Aug 30, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-cRMbEXJFeTZoGxOIx1N3V6-EgnJRqmjOWCXrWp6Gm_4.jar
    Aug 30, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-r4KQEtE-6yBfQPqnp8vfXF9ucGquJIfawjlSXdtqK70.jar
    Aug 30, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-nEMN_lK6pChtN73S5cwKpLnyhKne6IEpO-hMEALVbpw.jar
    Aug 30, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-MTrwDGMKQnF6D30K5b829_LPwZWWwd5SCOnHiJPUc1s.jar
    Aug 30, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-gfR0NbI6xDrMPD1jSaaQpNCrqzBYRnITUSzw6kkqWVE.jar
    Aug 30, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-6wlrkl04-zp_NH-QtDwVU5nOWBLQVswlB6KVHbSHcf4.jar
    Aug 30, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-KQ6QuLfTe4fjy1VCmBz7CjavZkXS51Sw28FfJyg9vyw.jar
    Aug 30, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-cgP9UsEq-6QkaMuGPT8KiaXXIIaZR-qH9Juh5hTpWyM.jar
    Aug 30, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-qmbEsQpxD890FuudXx2wCnsePowq-AKXgpsYZNYKdmM.jar
    Aug 30, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-hbur7m8M6k3V9SgolKX-5JlG3r4odJTcJI9pCRbjXQI.jar
    Aug 30, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-C4XNzZPW68Wi6qUiVPf6TAE9aO-R3dVue99IL2-pxDA.jar
    Aug 30, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-dkdbLtxCe2U-ZoR-cBJLXeD7HHNHgGFvcmnv3y6KM14.jar
    Aug 30, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-iiVen2KD7qoZ7Zyk4-SKJ0eHMFm1WbJy50nJyt2BDvk.jar
    Aug 30, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3366539333598282153.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-dHCyCIekjS7ZBSgsgawRKsR9lG5mcIbYqv5NbgJYY4U.jar
    Aug 30, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-Nif9f7y4t2QFl9lha9owm4v6Q9EhGGaWmk767fKTN0o.jar
    Aug 30, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-NzwsOI8jlpR9gFtJqFmLrf_FMuuVKc-HV5RWuA-byu4.jar
    Aug 30, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-eYPhC0pI_HJFpRhUEPSViFSITlZVcix6CLrtlU9jMJE.jar
    Aug 30, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-bH_HOkWDZ5lnLzmiDxEQeGAGbFllf8zG5_aZ22YezGU.jar
    Aug 30, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-zKtpW7YMV-b2VbX_NCDE8LvM8wCvccPFeaUVPjgnZOQ.jar
    Aug 30, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-IeQ50qztq9FEzrN3ApmHiNSKX0nkdwjMZNosfczBYRc.jar
    Aug 30, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-W8BKe_DKqSpqRnghnPdKM8Z46ZMCoi6U7GCXdkTepUg.jar
    Aug 30, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-D64Z-AqfhxmRHCJWFIuj0S9HSiVvXSwS36JKPmG5LPI.jar
    Aug 30, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-NvE7jqKRZ2V4PU7cs-4dNQvo7gwsdAiIL4VE7nA02zw.jar
    Aug 30, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-XHt0Plk9MC2WQgiq6N108FaLRLY3eVLAZG8rSGSV4Y8.jar
    Aug 30, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-23sTIPblR7p2_CqwcxYvP6VdHOHKL35jmKk65WyQrWk.jar
    Aug 30, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-GO0OMniL7PCN6YF5k6MzZ8tzCqgG0XK3qQmYr3W1V94.jar
    Aug 30, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-OUSKpuppCM8tfMmncrHk5I0hk0_qB9lKCe6Ts9UDo28.jar
    Aug 30, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-UUwV7ewrmGE1Vdroiv09VNYPXsxTnuiw8OByDwu2T68.jar
    Aug 30, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-CbV5K0e4g0S5NddLs9pXjNTyijpXfWueyQTrsdqGzQ8.jar
    Aug 30, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-92plXM6Jx40t5WlfYbqhrwZSXSsK6fsC3YHHMm6MDxk.jar
    Aug 30, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 1 seconds
    Aug 30, 2020 6:45:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 30, 2020 6:45:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 30, 2020 6:45:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 30, 2020 6:45:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 30, 2020 6:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 30, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92060 bytes, hash 2c5ce4c836badb31af2e34bb25fb5b11ca1b083076478214f67941e4440cdd2a> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-LFzkyDa62zGvLjS7JftbEcobCDB2R4IU9nlB5EQM3So.pb
    Aug 30, 2020 6:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Aug 30, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-30_11_45_19-16878698303687097133?project=apache-beam-testing
    Aug 30, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-30_11_45_19-16878698303687097133
    Aug 30, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-30_11_45_19-16878698303687097133
    Aug 30, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-30T18:45:19.158Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 30, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-30T18:45:27.303Z: Worker configuration: n1-standard-1 in us-central1-a.
    Aug 30, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-30T18:45:29.476Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 30, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-30T18:45:29.628Z: Expanding GroupByKey operations into optimizable parts.
    Aug 30, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-30T18:45:29.668Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 30, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-30T18:45:29.738Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 30, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-30T18:45:29.775Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 30, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-30T18:45:29.810Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 30, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-30T18:45:29.846Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 30, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-30T18:45:30.251Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 30, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-30T18:45:30.327Z: Starting 5 workers in us-central1-a...
    Aug 30, 2020 6:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-30T18:45:37.625Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 30, 2020 6:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-30T18:46:06.966Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 30, 2020 6:46:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-30T18:46:28.197Z: Workers have started successfully.
    Aug 30, 2020 6:46:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-30T18:46:28.224Z: Workers have started successfully.
    Aug 30, 2020 6:47:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-30T18:47:07.867Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 30, 2020 6:47:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-30T18:47:08.001Z: Cleaning up.
    Aug 30, 2020 6:47:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-30T18:47:08.072Z: Stopping worker pool...
    Aug 30, 2020 6:48:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-30T18:48:07.569Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 30, 2020 6:48:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-30T18:48:07.615Z: Worker pool stopped.
    Aug 30, 2020 6:48:15 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-30_11_45_19-16878698303687097133 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 0e1fa2ce-b690-43ab-9d73-ecbccca34ea1 and timestamp: 2020-08-30T18:48:15.564000000Z:
                     Metric:                    Value:
                   read_time                    18.418
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 30, 2020 6:48:16 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
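
    At the IO level, the push-down exercised by the readUsingDirectReadMethodPushDown run above corresponds to a BigQuery Storage API read with column projection and a row restriction. A minimal sketch, with a placeholder table reference rather than the dataset used by this job:

        import java.util.Arrays;
        import com.google.api.services.bigquery.model.TableRow;
        import org.apache.beam.sdk.Pipeline;
        import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
        import org.apache.beam.sdk.values.PCollection;

        class DirectReadPushDownSketch {
          static PCollection<TableRow> read(Pipeline p) {
            return p.apply(
                "Read HACKER_NEWS with push-down",
                BigQueryIO.readTableRows()
                    .from("my-project:my_dataset.HACKER_NEWS") // placeholder table reference
                    .withMethod(BigQueryIO.TypedRead.Method.DIRECT_READ)
                    // Column projection: only the fields used by the query are read.
                    .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                    // Predicate push-down: the filter is evaluated by the Storage API.
                    .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));
          }
        }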

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.017 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 7,5,main]) completed. Took 3 mins 9.639 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 58s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/kpfedehocmzcc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #934

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/934/display/redirect>

Changes:


------------------------------------------
[...truncated 294.35 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 30, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 30, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 30, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 30, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 30, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 30, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 30, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 30, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 30, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 30, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 30, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 30, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 30, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 30, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 30, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 30, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 30, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 30, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-Z2G5HuOiy6DehpkSPtaalfQk7xRQYoU8mxupxpDyxoo.jar
    Aug 30, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-1IVpYKpB-e2SrBJ5DNHU603Z1CYYb_pJcQ6mTt4qW3w.jar
    Aug 30, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-X5KW6t_bbV2eKHNQFleFaLy9Gg8eoRYSH8B1tXzqTwY.jar
    Aug 30, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-pxhQWlDK6qXLST2Tc7Jt07eB9u5gx9WaWRAqGO55wYA.jar
    Aug 30, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-86Z4pn5lBi_kvhe3tLBQ8d7LiFiNW6gQ0A_vBZFdVWc.jar
    Aug 30, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-MzvjT-3eLw9fDh1xwLIcfsUZLo_cJiptB5ZqiIsmzZg.jar
    Aug 30, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-FOGMPnDfIHf20GFOn4nrY0RM3q7MKbVsR_1A5zfOeig.jar
    Aug 30, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-q6_YmQItIawf-HEzVRD09Y9TVGjCMhJnSeshPzKwag8.jar
    Aug 30, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-Z2G5HuOiy6DehpkSPtaalfQk7xRQYoU8mxupxpDyxoo.jar
    Aug 30, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test1005551812840646928.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-81ILS_tqqMHy4pw9kbJzhpU-pVf0hQj56kDs6O3zVis.jar
    Aug 30, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-arlM4MflnPGqyQr6dva8lsN59w1zR1GjbY8Fz_GgGlA.jar
    Aug 30, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-1rZSIhjfc9QbZkpkCHwevZE-pe_U2h7MjAzTPYXqUs8.jar
    Aug 30, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-JVMLATg5cATKQDEZAA7ZSBm1xWu6_RIFO94pUdNG4lw.jar
    Aug 30, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-47gaKrimSIWoDfK5IJzgxiFfnxvg_MlzvhGN-WP4U3Y.jar
    Aug 30, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-5_8SHQAIioqliHB28Iw66F7EVF_lc2da22_Ziotp7kU.jar
    Aug 30, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-OK-jFpJa94nKDCUZawRTHBlhBizBrEFeKV476RVTP4M.jar
    Aug 30, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-Y_E5Tg3wGkfI1H6HXaNC6hKuI-_pm_8FUIvwqzwKqGY.jar
    Aug 30, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-i5hYl3zv74s1LobOGFRaPXJs06B6-EgX_0D5nV7GfxA.jar
    Aug 30, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-9fX8To0k4q99G_5C9stHeCHlzqbKcmzrxAnJr6MSiHY.jar
    Aug 30, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-591OfRodumQM2BnPGKwQbJ4I5HKtnlQ8zs8j2N3XvQI.jar
    Aug 30, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-hNI8FiMN4Rkj8u-MGqkn7jhZiQOX-nrKYvpkC6mCuFQ.jar
    Aug 30, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-trwEp5bRzhgY1YGNiv0XnDyWKk2PXjaODKUggHzQcwI.jar
    Aug 30, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-lWauRd58XvLyypXgGS59g8CSvZsIr36e56u2O4bLCm0.jar
    Aug 30, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-fLoG5rj2xDcmXC3Fm3dkiZiF5x8k1B5GCMBfS0X5pIw.jar
    Aug 30, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-yypf-D6-Y9PkDqwaGfUXLdYBVtW1vtuHF8CQATczMuA.jar
    Aug 30, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-IqEejNwx1ncLiQeGM3fEfp2JA_-SEMn40Fjr8nOM6K0.jar
    Aug 30, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-W007uwzvg24p9N5zE9TWHaPl27pebJR0OoZ4D3jojhA.jar
    Aug 30, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-FVPW1w1KCYpxDGpU1rCmT8_PcP90fGhqUiNu3GVAFQQ.jar
    Aug 30, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-RrDCinX1DMM6pR14s-bTzysUQ0BOqiA1OfUMXqJS98Y.jar
    Aug 30, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-Lk4gZwHTsjzRJRUnCcT87bRgJE4iFBZFbYvnWYUaZxM.jar
    Aug 30, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-pRYLWA2kpylDdWq4tudayTkeR3BSKpmsmAyE6VSqLsE.jar
    Aug 30, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 0 seconds
    Aug 30, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 30, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 30, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 30, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 30, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 30, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92061 bytes, hash 9ed5e33f0205914b788c5eaa9fde34ff270115952731b4976e1877bfb77c11b9> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-ntXjPwIFkUt4jF6qn940_ycBFZUnMbSXbhh3v7d8Ebk.pb
    Aug 30, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Aug 30, 2020 12:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-30_05_45_22-1477734097790047614?project=apache-beam-testing
    Aug 30, 2020 12:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-30_05_45_22-1477734097790047614
    Aug 30, 2020 12:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-30_05_45_22-1477734097790047614
    Aug 30, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-30T12:45:22.394Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 30, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-30T12:45:30.072Z: Worker configuration: n1-standard-1 in us-central1-a.
    Aug 30, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-30T12:45:30.676Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 30, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-30T12:45:30.718Z: Expanding GroupByKey operations into optimizable parts.
    Aug 30, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-30T12:45:30.746Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 30, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-30T12:45:30.824Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 30, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-30T12:45:30.844Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 30, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-30T12:45:30.873Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 30, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-30T12:45:30.902Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 30, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-30T12:45:31.290Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 30, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-30T12:45:31.367Z: Starting 5 workers in us-central1-a...
    Aug 30, 2020 12:46:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-30T12:45:59.845Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 30, 2020 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-30T12:46:02.110Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 30, 2020 12:46:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-30T12:46:31.466Z: Workers have started successfully.
    Aug 30, 2020 12:46:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-30T12:46:31.494Z: Workers have started successfully.
    Aug 30, 2020 12:47:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-30T12:47:06.889Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 30, 2020 12:47:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-30T12:47:07.042Z: Cleaning up.
    Aug 30, 2020 12:47:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-30T12:47:07.118Z: Stopping worker pool...
    Aug 30, 2020 12:48:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-30T12:48:03.964Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 30, 2020 12:48:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-30T12:48:04.015Z: Worker pool stopped.
    Aug 30, 2020 12:48:12 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-30_05_45_22-1477734097790047614 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 2455b799-ff23-473d-9ce6-b0efde8d6452 and timestamp: 2020-08-30T12:48:12.697000000Z:
                     Metric:                    Value:
                   read_time                    15.987
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 30, 2020 12:48:13 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 3 mins 3.111 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 53s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/qfmvfhnqimo3i

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #933

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/933/display/redirect>

Changes:


------------------------------------------
[...truncated 293.35 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 30, 2020 6:45:10 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 30, 2020 6:45:10 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 30, 2020 6:45:10 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 30, 2020 6:45:10 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 30, 2020 6:45:10 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 30, 2020 6:45:10 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 30, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
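
The IllegalStateException above points at the usual fix for a Row-typed output: attach a schema so the SDK can use a RowCoder. Below is a minimal, self-contained sketch of that pattern; it is not the integration test's actual code, and the class name, schema fields, and DoFn are made up purely for illustration.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        // Hypothetical schema mirroring the projected columns in the query above.
        Schema schema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();

        PCollection<Row> rows =
            p.apply(Create.of("story", "job"))
                .apply(
                    ParDo.of(
                        new DoFn<String, Row>() {
                          @ProcessElement
                          public void processElement(ProcessContext c) {
                            // Emit a Row built against the schema declared above.
                            c.output(
                                Row.withSchema(schema)
                                    .addValues("someone", c.element(), "a title", 3L)
                                    .build());
                          }
                        }))
                // This is the step the error message asks for: without it, coder
                // inference fails exactly as logged above.
                .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }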

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 30, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 30, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 30, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 30, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 30, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 30, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 30, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 30, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
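
    For context, the projection (usedFields=[by, type, title, score]) and the filter logged here roughly correspond to what a plain BigQueryIO Storage API read would be handed directly. The sketch below shows that shape only; the table reference is a placeholder and this is not the test's code.

    import java.util.Arrays;
    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.values.PCollection;

    public class DirectReadSketch {
      public static void main(String[] args) {
        // Running this for real would need GCP credentials and pipeline options
        // (project, etc.); it is only a sketch of the DIRECT_READ shape.
        Pipeline p = Pipeline.create();

        PCollection<TableRow> rows =
            p.apply(
                "Read HACKER_NEWS with push-down",
                BigQueryIO.readTableRows()
                    // Placeholder table reference, not the dataset used by this job.
                    .from("some-project:some_dataset.hacker_news")
                    .withMethod(BigQueryIO.TypedRead.Method.DIRECT_READ)
                    // Column projection, as in usedFields=[by, type, title, score].
                    .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                    // Filter pushed to the BigQuery Storage API, as logged above.
                    .withRowRestriction(
                        "(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }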
    Aug 30, 2020 6:45:12 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 30, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 30, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-8IOtTrfF5QvxCV6nESeB1W-_MT4iFV8cGY-jt1kSfUM.jar
    Aug 30, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-Px6-BFGPYVjq_vXft-o0Re3PERDo_df8SSK1XEyeXEo.jar
    Aug 30, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-UtJV3R1BM332eDTb2IF0l1IQ5TU7vanVV8UJoHtmq3I.jar
    Aug 30, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-pKpoaz2EdfyrYc-KRqCa34dNPuth-o3OOfmC3BEFREk.jar
    Aug 30, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test1363745297878381292.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-Air6XNBAAi40RV9dEOlalnU4JyNaQz9E_uzVtYvTCGo.jar
    Aug 30, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-eMzDE_em8FsL6jy9c4CY6fOCFDYcIVy1Hllf342fGzI.jar
    Aug 30, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-VOeA5EaWZmDtxReS5HjmF8YDYlcVGsi7kQDPtel0xR0.jar
    Aug 30, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-fD6OOh5e6xRuE1vxqDVYgMVXTecT0h-hb6LQHKzl8VY.jar
    Aug 30, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-oDIC0laDUDfgrUQAYh74ez_Wx_93XgDTWgGz8DkwVqs.jar
    Aug 30, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-rOlVJfK_WhGD21uiuRM7UHivV57JRoQriCj-PA9TRzU.jar
    Aug 30, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-3ZEutHO9hCnB5jYRUPrY3QTg5BZ7tL6aOMZlZhkC8qw.jar
    Aug 30, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-ciYaUfnAAclDLvnahXnxXJVL5bSJvHfwSEleuOMk9uw.jar
    Aug 30, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-4ew87sYo4BVn6bkFxVffC4Ounm4T-7oH9k9ckP_ZjrA.jar
    Aug 30, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-xaajebppDx8musbD7y7VeZa40oIzBzgjxu9julrA8sU.jar
    Aug 30, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-B1oTUGN1ywZJbjL90YlYiJoik1cnya4VRkwD-wKDhhk.jar
    Aug 30, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-Et1vkW5Rfm4t3RL5Yu2HFqwxj6Jbhsn-lGQfNo8dB7w.jar
    Aug 30, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-DCBLqb0_vAAOb9yLHXzN6X4CC2qAjDB3OdY_KZ-2Ijo.jar
    Aug 30, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-TWsR7VyMcEmf5tiMhJctx2rLaATSHlwP7C0l6zEEJxA.jar
    Aug 30, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-VCKK48aueaQvB4ecg-Ds9kIjwhmGWS5796MuPAE3JrQ.jar
    Aug 30, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-Nx_QryIbYACOOpthoOlzIhKhG6iwypJ38v5WCRqwkzE.jar
    Aug 30, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-8IOtTrfF5QvxCV6nESeB1W-_MT4iFV8cGY-jt1kSfUM.jar
    Aug 30, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-Ku3OqPsx2QM46J29NSSd0Se8ojGd65DENqd5-1QfUBE.jar
    Aug 30, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-vmMO_aWO2T1yv647PhI62YqwAwvmicG876KmaPfyEAs.jar
    Aug 30, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-WJ8ES1T7z4RWIpQ6Wy0JPPdgHW05tD9OhlEy0Ec5cPE.jar
    Aug 30, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-xq6DZNBqy6N21vkQIiXgEvTTsicdMpy_ROzHjQso9ag.jar
    Aug 30, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-QUjWF1JvSIO-HC-uPWUvBp7hVDUNL5tjhmn_OdKB2DA.jar
    Aug 30, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-M3ev_V2yhDxwVYA-n_5wPpCC40na6N72Ams8ORsFF_s.jar
    Aug 30, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-bUOB5mQ-CPJtSW1OJPslykP6Dg3r1kwR5Vzhim205Ls.jar
    Aug 30, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-C8WFlH9lIR6QOkulMxQzRbbsNjr9L5mMyk4ZSqOQBPo.jar
    Aug 30, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-gwZLaG8RtM5D4bcaSSqfij8OchsDQoKmHxk9cGsTtso.jar
    Aug 30, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT--cY4biaG1xvz4bguj_rZePIiVSarGvbt7a77ITEr_0Y.jar
    Aug 30, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 1 seconds
    Aug 30, 2020 6:45:15 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 30, 2020 6:45:15 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 30, 2020 6:45:15 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 30, 2020 6:45:15 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 30, 2020 6:45:15 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 30, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92060 bytes, hash 862662cff89521fe81acf1658ea47b94fe24f83ad453a4d7ef7105b39ff97048> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-hiZiz_iVIf6BrPFljqR7lP4k-DrUU6TX73EFs5_5cEg.pb
    Aug 30, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Aug 30, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-29_23_45_16-14293083165899332996?project=apache-beam-testing
    Aug 30, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-29_23_45_16-14293083165899332996
    Aug 30, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-29_23_45_16-14293083165899332996
    Aug 30, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-30T06:45:16.151Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 30, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-30T06:45:23.596Z: Worker configuration: n1-standard-1 in us-central1-a.
    Aug 30, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-30T06:45:24.258Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 30, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-30T06:45:24.298Z: Expanding GroupByKey operations into optimizable parts.
    Aug 30, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-30T06:45:24.328Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 30, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-30T06:45:24.405Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 30, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-30T06:45:24.434Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 30, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-30T06:45:24.466Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 30, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-30T06:45:24.500Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 30, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-30T06:45:24.863Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 30, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-30T06:45:24.941Z: Starting 5 workers in us-central1-a...
    Aug 30, 2020 6:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-30T06:45:43.446Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 30, 2020 6:45:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-30T06:45:57.524Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 30, 2020 6:46:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-30T06:46:17.545Z: Workers have started successfully.
    Aug 30, 2020 6:46:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-30T06:46:17.576Z: Workers have started successfully.
    Aug 30, 2020 6:46:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-30T06:46:49.806Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 30, 2020 6:46:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-30T06:46:49.983Z: Cleaning up.
    Aug 30, 2020 6:46:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-30T06:46:50.049Z: Stopping worker pool...
    Aug 30, 2020 6:47:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-30T06:47:53.552Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 30, 2020 6:47:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-30T06:47:53.593Z: Worker pool stopped.
    Aug 30, 2020 6:48:01 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-29_23_45_16-14293083165899332996 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 0580e707-53b6-4973-9271-7f5188c4d95b and timestamp: 2020-08-30T06:48:01.671000000Z:
                     Metric:                    Value:
                   read_time                     13.29
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 30, 2020 6:48:02 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.016 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 8,5,main]) completed. Took 2 mins 59.183 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 45s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/6mj75zbypvbim

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #932

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/932/display/redirect>

Changes:


------------------------------------------
[...truncated 293.49 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 30, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 30, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 30, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 30, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 30, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 30, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 30, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 30, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 30, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 30, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 30, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 30, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 30, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 30, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 30, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 30, 2020 12:45:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 30, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 30, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-qdJsUVTSye7vjxsrQGsRClEHmmA28wpgziyvPdWVOlo.jar
    Aug 30, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-U2BFw4IfHrbz1fD_I1qPDNaoF_l38tOk5OC-R5Pu6Hc.jar
    Aug 30, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-r2KQmeVveBHh-21p__c3dPUvomHcU3Y2n4PhTMbHksA.jar
    Aug 30, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-xL_vrClYjO8qbKqfNgPwiuZCKHxYGd77QP34kho9tC8.jar
    Aug 30, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-UEXBkGoExsyR4RltgO-nz3XoLAnXcdTeWsKkib562KQ.jar
    Aug 30, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-wVP1EiO0jYeXpiUAtQVUVTkhOtXlCVCwGv1PWc9co8E.jar
    Aug 30, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-ecRXeVhwg4x_HvXjnQI8cWfVGzZzqatsoxS3HfRFvTI.jar
    Aug 30, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-eut1ftpDurtbp3fpU_0Cga4MqosFaWGZwkIUuszyMKw.jar
    Aug 30, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-15_zRKbt0O89R5_8jGdd-hMGBWq8KJ_1khBEMWL_hQo.jar
    Aug 30, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-5mPXVfp_wfyBiJSr02v7AMsZTYoJj3XRQf4tP6u0JsQ.jar
    Aug 30, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-hfn15KmrV9PmIEOVovcdaKmsl5O3udxubDODsyp_OXc.jar
    Aug 30, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-235Z2IDgu3WWAiYTt_93wOAElM3gzveVx3phqRfSAjA.jar
    Aug 30, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-7pvGMkOjsiZOBGZ4XUWmZiqV4mpCY345SVT1Y2TIoZE.jar
    Aug 30, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-8oIZtUIgrzJPGY8uT9laZQyirKPacDzmfBZ3B8rlBSI.jar
    Aug 30, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-6lDJv0r_sUvYCgAjufxkkQ7aLqki6VDpQ2XmT9V6LKE.jar
    Aug 30, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-xHgYHZhi5JS1aWSeNA40r_RhBBaFY4aJouy7OXAcsFg.jar
    Aug 30, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-O1sK0KQkfKBYgs-Xm9B5--VzIZ5O-Wmy5gvnJPCROf8.jar
    Aug 30, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-G_1s26Rxbj6aq5viWHfhKBfpQE84bB0CRtk-rXmaHr0.jar
    Aug 30, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-bFbBeYleeCFe-3LDIOokW5G4AlbbEOy5GNRSMyXB6EQ.jar
    Aug 30, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-KggwF8iHqzdRGlP70mI1p8IOWK6McXTg3GPsoGqXeDA.jar
    Aug 30, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-2CFGqxVK2PAG978t7GwpPftWUIkAN9YvE8rUTzGClec.jar
    Aug 30, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-PfQ8ioy7EqW0itu7W2y_vdgFY5ZriYEDfLjUczayMDM.jar
    Aug 30, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-fTE8bCP9R2544WCQRw3cLdeMyHtRPv2SbZldsXUhjUE.jar
    Aug 30, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7735401124478259333.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-4vNEwDT1Aa2QCeFVWasgPQI_YdhnPDbEqCd2-ElH3OY.jar
    Aug 30, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-hfn15KmrV9PmIEOVovcdaKmsl5O3udxubDODsyp_OXc.jar
    Aug 30, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-GjlgOrdLEkQKenhAvzCzFYpvqYLoQ5_XzDVYU5l3vlw.jar
    Aug 30, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-plAYERi4aFKgoCC1iCu-yM8yX_tn3r4G84H7GJABt10.jar
    Aug 30, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-Q0gmmGIrnxN-OdAYABusXdHqX1VvRdUpXvI308yiAu4.jar
    Aug 30, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-Zf9nZHd2ZsyaPn5Ae9fkWTVfmb6aCqWr_jlBzh3itrs.jar
    Aug 30, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-6MgtbEKGwJlnIrEEXV9K4u3fvmZemCdjXHNfxmMHqNc.jar
    Aug 30, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-bOj3lwvNrpmlx8xLwg6x99TobrzfcyFv_yAjrrJNFTw.jar
    Aug 30, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 1 seconds
    Aug 30, 2020 12:45:19 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 30, 2020 12:45:19 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 30, 2020 12:45:19 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 30, 2020 12:45:19 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 30, 2020 12:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 30, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92060 bytes, hash 1cff8aee19f7d8680571e04b40f9f105a1613f9c51f7a843db0db0cc0fd1f9af> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-HP-K7hn32GgFceBLQPnxBaFhP5xR96hD2w2wzA_R-a8.pb
    Aug 30, 2020 12:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Aug 30, 2020 12:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-29_17_45_20-1386734540516625291?project=apache-beam-testing
    Aug 30, 2020 12:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-29_17_45_20-1386734540516625291
    Aug 30, 2020 12:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-29_17_45_20-1386734540516625291
    Aug 30, 2020 12:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-30T00:45:20.051Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 30, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-30T00:45:27.131Z: Worker configuration: n1-standard-1 in us-central1-a.
    Aug 30, 2020 12:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-30T00:45:28.076Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 30, 2020 12:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-30T00:45:28.116Z: Expanding GroupByKey operations into optimizable parts.
    Aug 30, 2020 12:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-30T00:45:28.153Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 30, 2020 12:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-30T00:45:28.230Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 30, 2020 12:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-30T00:45:28.268Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 30, 2020 12:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-30T00:45:28.301Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 30, 2020 12:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-30T00:45:28.326Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 30, 2020 12:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-30T00:45:28.675Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 30, 2020 12:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-30T00:45:28.755Z: Starting 5 workers in us-central1-a...
    Aug 30, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-30T00:45:42.469Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 30, 2020 12:45:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-30T00:45:55.347Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 30, 2020 12:46:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-30T00:46:21.328Z: Workers have started successfully.
    Aug 30, 2020 12:46:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-30T00:46:21.357Z: Workers have started successfully.
    Aug 30, 2020 12:46:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-30T00:46:54.539Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 30, 2020 12:46:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-30T00:46:54.683Z: Cleaning up.
    Aug 30, 2020 12:46:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-30T00:46:54.768Z: Stopping worker pool...
    Aug 30, 2020 12:47:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-30T00:47:48.058Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 30, 2020 12:47:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-30T00:47:48.134Z: Worker pool stopped.
    Aug 30, 2020 12:47:55 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-29_17_45_20-1386734540516625291 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 558c81b3-9337-4ec8-acd8-8b872dbad0a3 and timestamp: 2020-08-30T00:47:55.968000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     13.31

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 30, 2020 12:47:56 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
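
The metrics above (fields_read, read_time) were computed but not exported because the InfluxDB measurement/database settings were not supplied to this run. A rough, purely illustrative sketch of that kind of guard (the class and property names below are assumptions, not Beam's actual InfluxDBPublisher source):

    // Illustrative only: publish metrics when both settings are present,
    // otherwise log the same warning seen above and skip publication.
    import java.util.logging.Logger;

    public class PublishWithCheckSketch {
      private static final Logger LOG = Logger.getLogger(PublishWithCheckSketch.class.getName());

      static void publishWithCheck(String measurement, String database, Runnable doPublish) {
        if (measurement == null || measurement.isEmpty()
            || database == null || database.isEmpty()) {
          LOG.warning("Missing property -- measurement/database. Metrics won't be published.");
          return;
        }
        doPublish.run();
      }

      public static void main(String[] args) {
        // With either setting absent, results such as fields_read / read_time stay local.
        publishWithCheck(
            System.getProperty("influxMeasurement"), // hypothetical property name
            System.getProperty("influxDatabase"),    // hypothetical property name
            () -> LOG.info("Would publish load test metrics to InfluxDB here."));
      }
    }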

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.017 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 2 mins 49.232 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 39s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/jz2l754vxjmsi

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #931

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/931/display/redirect>

Changes:


------------------------------------------
[...truncated 294.31 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 29, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 29, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 29, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 29, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 29, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 29, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 29, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])
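
The plan above is the DEFAULT-method variant of the query: the projection and filter stay in a BeamCalcRel on top of a full-width BeamIOSourceRel instead of being pushed into the source. For reference, a minimal, self-contained sketch of how a query of this shape runs through Beam SQL over an in-memory PCollection<Row> (the schema and rows are made-up stand-ins for the HACKER_NEWS table, not the IT's setup):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class HackerNewsSqlSketch {
      public static void main(String[] args) {
        Pipeline pipeline = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Stand-in schema mirroring the columns used by the query above.
        Schema schema =
            Schema.builder()
                .addStringField("by")
                .addStringField("type")
                .addStringField("title")
                .addInt32Field("score")
                .build();

        PCollection<Row> hackerNews =
            pipeline.apply(
                Create.of(
                        Row.withSchema(schema).addValues("alice", "story", "A story", 5).build(),
                        Row.withSchema(schema).addValues("bob", "comment", "A comment", 1).build())
                    .withRowSchema(schema));

        // A single input PCollection is registered as table PCOLLECTION for SqlTransform.
        PCollection<Row> filtered =
            hackerNews.apply(
                SqlTransform.query(
                    "SELECT `by` AS `author`, `type`, `title`, `score` "
                        + "FROM PCOLLECTION "
                        + "WHERE (`type` = 'story' OR `type` = 'job') AND `score` > 2"));

        pipeline.run().waitUntilFinish();
      }
    }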


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
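
The readUsingDefaultMethod failure above is the SDK refusing to infer a coder for a PCollection of Row: a Row-typed output needs either an explicit schema or an explicit coder before the graph is finalized. A minimal sketch of the two remedies the message names (the field names and types are assumptions based on the query; this is not the fix applied to BigQueryIOPushDownIT):

    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowCoderRemedies {
      // Option 1: attach a schema and let the SDK derive the coder (preferred for Row).
      static PCollection<Row> withRowSchema(PCollection<Row> rows, Schema schema) {
        return rows.setRowSchema(schema);
      }

      // Option 2: set the coder explicitly, as suggested by ".setCoder()".
      static PCollection<Row> withExplicitCoder(PCollection<Row> rows, Schema schema) {
        return rows.setCoder(RowCoder.of(schema));
      }

      // Example schema matching the columns selected by the query above (assumed types).
      static Schema exampleSchema() {
        return Schema.builder()
            .addStringField("author")
            .addStringField("type")
            .addStringField("title")
            .addInt32Field("score")
            .build();
      }
    }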

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 29, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 29, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 29, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 29, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 29, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 29, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 29, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 29, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
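
The two log entries above show the point of the push-down test: with DIRECT_READ, only the four used fields are requested and the supported predicate is handed to the BigQuery Storage API instead of being evaluated in BeamCalcRel. A rough equivalent written directly against BigQueryIO (the table name is a placeholder; the IT goes through the SQL layer rather than a hand-written read like this):

    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline pipeline = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        pipeline.apply(
            BigQueryIO.readTableRows()
                .from("some-project:some_dataset.hacker_news") // placeholder table
                .withMethod(TypedRead.Method.DIRECT_READ)
                // Column projection: only the fields the query actually uses are read.
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                // Filter push-down: the supported predicate is evaluated server-side.
                .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

        pipeline.run().waitUntilFinish();
      }
    }
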
    Aug 29, 2020 6:45:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-gzfoIqkf_BZcM4ryvbPLC_kqUWDMmwgyhGsZRTUPohI.jar
    Aug 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-KL1h9ZldgIREobw0hJohQyoFDx0R0ulRCFbUawbeuO4.jar
    Aug 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-TGi5aBVuE5g7bOloeU4W4eq453TIXqCw7f1v-tdp2qw.jar
    Aug 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-xphAXyKaFvh1y4kyQVCfY1qFytp0GwG0LIDTJhIsjiA.jar
    Aug 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-2OiWvkD0dXqB3UFPqySmWMEf8-WLg4QeYvOLgj6eZhE.jar
    Aug 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-VGde6vsc277hX3DEGxzflqxGeE8mfqDlTTj2LC_mQJQ.jar
    Aug 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-yUOAyQV2V5qGkI-zfEtGnTMHta7o4PXI-EPZ_1s17vw.jar
    Aug 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-aXEk0Ciooem-TgXrQcAX-zDXI2AxjYNveR9HKtRanEo.jar
    Aug 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-QMY5MMK5c9e57GiJqMhCiXfPXTNIJlY5Muvj4XjcwIg.jar
    Aug 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-dd7Crx3lhfbCNvaR7D1yhQn0_1U0lMQnQyXvpLiJurY.jar
    Aug 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-D9n4pq6stz6nDnK-8TvKeRnvuutMcPbk7i3-hWz3eW8.jar
    Aug 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8721943249641727594.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-219hg7QIDfA31A9lBymA3mixNJpeHu3BDY6oniYEqME.jar
    Aug 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-0qLgKIpdX0beICUtwqZiqQz8P3vYRU9SkaW2nvDAqRg.jar
    Aug 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-_cVZhOYZ6qiUhjojMmpq-TMp0ys-Nv87l1tfAiDr8-4.jar
    Aug 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-gzfoIqkf_BZcM4ryvbPLC_kqUWDMmwgyhGsZRTUPohI.jar
    Aug 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-6g0hzqH-oY7T_vQkc5uXp90H4WaRNcdjDjxNs1tMXQo.jar
    Aug 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-5LiNEZURUfTWcOzR15sX3N51Mh_AxdnK_fgC4iYFI70.jar
    Aug 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-IrIZevPL7VwjyPA95AjelcqUR5oLl0jXfmHRVkR7n6Q.jar
    Aug 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-nFs5XljYLsPVq5iPhEFv1i1fZcQJfvmc3nMhEX0DAXg.jar
    Aug 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-ESBP6VLby9M_ZMbsGWZAnf1H_13pK_of0FHqKgRB_1M.jar
    Aug 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-mMGoULEslUmv2rYGZuKOqYxsfvky8xCZZGaxURAi9_E.jar
    Aug 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-CeHxWewpY8JoW3EMHxqz6P5aKKMhgpxmcr12gxWpmUE.jar
    Aug 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-_W1b76a5AWKjYQLFY5EQlc2eVgS7DY0rgkZ0duB_FRQ.jar
    Aug 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-_GpqtFeRb--aqEmCFg_x6ZLX3zPssJ1Oe6HF5JprkBs.jar
    Aug 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-lBlxvlhnilmAM7kZMn-AUrsPraGlAF9YoSN40jBIukA.jar
    Aug 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-JBFK60MBpQnEyTTID5Zjp27e2BYcfq4QCefW7DQBCH0.jar
    Aug 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-QfVETIxgouE3E6mE9DKTroiyoNherBUGKS60j_KFTPE.jar
    Aug 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-RruZWgIu9ov7fIXBgTnM1BC_sj_EgSbojQl3cH9xYj0.jar
    Aug 29, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-lioReYJKlYTDi7hNzopTErFqXoACz3aV2gf0bZ9kghk.jar
    Aug 29, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-e7PbvSCN1RMd56yL8FazRWpEnCnqtJUa6ZrGk3_r0EA.jar
    Aug 29, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-JGfTzptMi1S4gf3_rpWJ0bmYasD_wrwXhIPHsDr5cHY.jar
    Aug 29, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 1 seconds
    Aug 29, 2020 6:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 29, 2020 6:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 29, 2020 6:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 29, 2020 6:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 29, 2020 6:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 29, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92060 bytes, hash c130b7b6d1b5ab25efbd509c6017b2a5bf4ddc2c0801bd83e65a6bd81c173686> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-wTC3ttG1qyXvvVCcYBeypb9N3CwIAb2D5lpr2BwXNoY.pb
    Aug 29, 2020 6:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Aug 29, 2020 6:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-29_11_45_21-4409050556513517229?project=apache-beam-testing
    Aug 29, 2020 6:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-29_11_45_21-4409050556513517229
    Aug 29, 2020 6:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-29_11_45_21-4409050556513517229
    Aug 29, 2020 6:45:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-29T18:45:21.687Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 29, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-29T18:45:29.289Z: Worker configuration: n1-standard-1 in us-central1-a.
    Aug 29, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-29T18:45:30.048Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 29, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-29T18:45:30.079Z: Expanding GroupByKey operations into optimizable parts.
    Aug 29, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-29T18:45:30.110Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 29, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-29T18:45:30.187Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 29, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-29T18:45:30.216Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 29, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-29T18:45:30.249Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 29, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-29T18:45:30.284Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 29, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-29T18:45:30.795Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 29, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-29T18:45:30.867Z: Starting 5 workers in us-central1-a...
    Aug 29, 2020 6:45:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-29T18:45:46.852Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 29, 2020 6:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-29T18:45:58.380Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 29, 2020 6:46:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-29T18:46:21.767Z: Workers have started successfully.
    Aug 29, 2020 6:46:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-29T18:46:21.804Z: Workers have started successfully.
    Aug 29, 2020 6:46:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-29T18:46:57.221Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 29, 2020 6:46:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-29T18:46:57.378Z: Cleaning up.
    Aug 29, 2020 6:46:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-29T18:46:57.458Z: Stopping worker pool...
    Aug 29, 2020 6:47:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-29T18:47:47.016Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 29, 2020 6:47:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-29T18:47:47.055Z: Worker pool stopped.
    Aug 29, 2020 6:48:03 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-29_11_45_21-4409050556513517229 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 4c376d5a-6850-433c-b2d5-46e1f6bc538b and timestamp: 2020-08-29T18:48:03.107000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    15.644

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 29, 2020 6:48:03 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.019 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 2 mins 56.909 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 46s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/uku7wa4nboioc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #930

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/930/display/redirect>

Changes:


------------------------------------------
[...truncated 294.61 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 29, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 29, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 29, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 29, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 29, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 29, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 29, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 29, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 29, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 29, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 29, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 29, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 29, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 29, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 29, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 29, 2020 12:45:14 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 29, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 29, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-pj9g9fgR1T8Y9VZ7Av_gHQJWR9BWHu4YO6_FrVan_ME.jar
    Aug 29, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-zAmQH0mJrGrtGH9rFs1kvnD4ZmfYmaz2IkF3zrZoYDg.jar
    Aug 29, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-R1OuM4__TMCHrIKKSZZXB7jkXh6Zm-8lhrs-E_A70AM.jar
    Aug 29, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-bYfR6HXG2KsTLyRcL0jaIjUWbK0z8YOCwNQRpqIGi1c.jar
    Aug 29, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-GMgSXM2EsgE-MOpX-1j6e5FtAQlExWpRBqlbE_-h0yk.jar
    Aug 29, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-pStjR6EXiHhLwK-794345e9YZHUHX22-AAv0IToc5bE.jar
    Aug 29, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-KuynVwAIuGHR06iVuEo_mpsIZFbyiyxWHFK0oQF8P2U.jar
    Aug 29, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-GU5Gyv0mWzV7Jk4yxd8JUFB_fLelUhGVifX-xFwPugw.jar
    Aug 29, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-6N9kFZ2AFeCYB1IrYN3QPv_H1sYO3fbjhrxpHpCZ6dk.jar
    Aug 29, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-XRTd1yDywDOoXCDwUnAdcL6MgyfZljp2PrDYV8VmsLY.jar
    Aug 29, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-_-1EEMM_8KmzIriwhMnr8pAOPK-sCQHJiOmvzIYRL4k.jar
    Aug 29, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-z9FQYqKDlqyBnroQmhqPzss9ELgDsxC6iH0rXNChCM4.jar
    Aug 29, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-pj9g9fgR1T8Y9VZ7Av_gHQJWR9BWHu4YO6_FrVan_ME.jar
    Aug 29, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-RLtO9yOrGqBHKw4De4eIKU3ad5-1YvEU8XMVq_xdTQw.jar
    Aug 29, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4680798788979333854.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-OZAOjfHsUUthjYx9KOXHY3kyZUOUYj9M-g2D-CM8a0I.jar
    Aug 29, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-ui9CICnYDlBvMBUfe_Hy4qMksXrQInK_Mj4WJhuJkTs.jar
    Aug 29, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-q3HQ0DN7jniQrYfQ8cMp1cikAKsmOcs06TH6-OewBns.jar
    Aug 29, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-xW6SpfTodAgbNDaYI4AMkVxR7AwFYDeQ8EIklrcHNV4.jar
    Aug 29, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-Yb5q4menz1Y2icUbEr-kKx_V3EdNf89S6lnZhIvRP0s.jar
    Aug 29, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-DWd_qpbgKUZgJ0k0_nKGfyzONmSQGqs1ps7D-AzwVpg.jar
    Aug 29, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-kQrI0MQ-ukTxUDzzntEhfeqaDCv8jUkdqgRYe62vVFg.jar
    Aug 29, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-aov9qSH4duQZ9lXD6hk3iDLF25sECkFcxbjueLF--zg.jar
    Aug 29, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-QzrAh7wJI8vJZ2booyVNfYhoMNF3eJI4IpCtwMoaLss.jar
    Aug 29, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-Pcl2hQm2WL5-LLwRUQ58Jh8U6zZuZcPSopGulC2-fUk.jar
    Aug 29, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-wsQa4z5SqxStC-bwZfOGW89i1RMGQYqA94gpNQbH3KE.jar
    Aug 29, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-QGx1qV8tW9oz2BC-NzRmfaAs68qrTfa8bNI7WWVruM4.jar
    Aug 29, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-lMKfDqH1A2ucWQUxqVRjao0k7IVDaoUJKDpYKzHsIzk.jar
    Aug 29, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-VbNts9bNzAhCmEdBhUnwlD4pCty_lKpcXU3hzGlq2W8.jar
    Aug 29, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-pjesbprWmR-rwbEB__pLpzLL4vxC31hExfrB1I0oDgQ.jar
    Aug 29, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-6qZ_ErUpD4A0U58ppip9sC7YnR7llc9qcu9u-x4RXzc.jar
    Aug 29, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-55aSuHYw1ujrn7AyU1TmsSmmgxS99_zClNlxxaEHtZc.jar
    Aug 29, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 1 seconds
    Aug 29, 2020 12:45:17 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 29, 2020 12:45:17 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 29, 2020 12:45:17 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 29, 2020 12:45:17 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 29, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 29, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92059 bytes, hash 11f028735a2e7a9e8b6eb13fa6bdda5f72bc78fb829f4b7ecd64fdda3b8c6512> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-EfAoc1ouep6LbrE_pr3aX3K8ePuCn0t-zWT92juMZRI.pb
    Aug 29, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Aug 29, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-29_05_45_18-16634514923477510290?project=apache-beam-testing
    Aug 29, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-29_05_45_18-16634514923477510290
    Aug 29, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-29_05_45_18-16634514923477510290
    Aug 29, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-29T12:45:18.370Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
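
For context on the warning above: with autoscaling explicitly disabled, Dataflow provisions a fixed pool of numWorkers and maxNumWorkers has no effect. The following is a minimal, illustrative sketch of the Dataflow options involved; the class name and the literal values are placeholders that mirror the logged configuration, not the test's actual setup code.

    import org.apache.beam.runners.dataflow.DataflowRunner;
    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions.AutoscalingAlgorithmType;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class WorkerPoolOptionsSketch {
      public static void main(String[] args) {
        DataflowPipelineOptions options =
            PipelineOptionsFactory.fromArgs(args).as(DataflowPipelineOptions.class);
        options.setRunner(DataflowRunner.class);
        // Autoscaling is disabled, so the service keeps a fixed pool of numWorkers.
        options.setAutoscalingAlgorithm(AutoscalingAlgorithmType.NONE);
        options.setNumWorkers(5);
        // Ignored when the autoscaling algorithm is NONE, which is what the warning reports.
        options.setMaxNumWorkers(5);
        options.setWorkerMachineType("n1-standard-1"); // matches the logged worker configuration
        options.setRegion("us-central1");
        // Pipeline p = Pipeline.create(options); ...
      }
    }
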
    Aug 29, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-29T12:45:26.109Z: Worker configuration: n1-standard-1 in us-central1-a.
    Aug 29, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-29T12:45:26.770Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 29, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-29T12:45:26.812Z: Expanding GroupByKey operations into optimizable parts.
    Aug 29, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-29T12:45:26.883Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 29, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-29T12:45:26.959Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 29, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-29T12:45:26.993Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 29, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-29T12:45:27.022Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 29, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-29T12:45:27.060Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 29, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-29T12:45:27.392Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 29, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-29T12:45:27.470Z: Starting 5 workers in us-central1-a...
    Aug 29, 2020 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-29T12:45:55.931Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 29, 2020 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-29T12:46:03.655Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 29, 2020 12:46:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-29T12:46:26.160Z: Workers have started successfully.
    Aug 29, 2020 12:46:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-29T12:46:26.188Z: Workers have started successfully.
    Aug 29, 2020 12:47:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-29T12:47:03.608Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 29, 2020 12:47:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-29T12:47:03.781Z: Cleaning up.
    Aug 29, 2020 12:47:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-29T12:47:03.869Z: Stopping worker pool...
    Aug 29, 2020 12:48:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-29T12:47:58.767Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 29, 2020 12:48:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-29T12:47:58.835Z: Worker pool stopped.
    Aug 29, 2020 12:48:07 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-29_05_45_18-16634514923477510290 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 77f233a7-7d0c-4453-b492-96cad6b55ffa and timestamp: 2020-08-29T12:48:07.751000000Z:
                     Metric:                    Value:
                   read_time                    18.536
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 29, 2020 12:48:08 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 3 mins 2.763 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 51s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/d2c3abydj6sjg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #929

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/929/display/redirect?page=changes>

Changes:

[srohde] Remove unnecessary limiters and add a method to get the size of a


------------------------------------------
[...truncated 293.72 KB...]
    INFO: BigQuery method is set to: DEFAULT
    Aug 29, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 29, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 29, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 29, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 29, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 29, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
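
The IllegalStateException above is the coder-inference failure the message describes: a ParDo that emits Row gives the SDK nothing to infer a Coder from, so the output PCollection needs either an explicit coder or, as the message suggests, a row schema. A minimal sketch of that fix is below; the pass-through DoFn and the hand-built schema are hypothetical stand-ins, not the test's actual RowMonitor or HACKER_NEWS schema.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      // Hypothetical stand-in for the test's RowMonitor ParDo: passes rows through unchanged.
      static class PassThroughFn extends DoFn<Row, Row> {
        @ProcessElement
        public void process(@Element Row row, OutputReceiver<Row> out) {
          out.output(row);
        }
      }

      public static void main(String[] args) {
        Schema schema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();

        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        Row example =
            Row.withSchema(schema).addValues("someone", "story", "a title", 3L).build();

        PCollection<Row> rows = p.apply(Create.of(example).withRowSchema(schema));

        rows.apply("RowMonitor", ParDo.of(new PassThroughFn()))
            // A ParDo output of Row has no inferable coder; attaching the schema lets the
            // SDK use a RowCoder and avoids the IllegalStateException seen above.
            .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }
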

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 29, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 29, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 29, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 29, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 29, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 29, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 29, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 29, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
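
The plan and the log line above show what push-down amounts to here: only the used fields are requested and the WHERE clause is handed to the BigQuery Storage API as a row restriction. Outside of Beam SQL, a hand-written read with the same effect would look roughly like the sketch below; the table spec is a placeholder and this is not the test's own code, which reaches the same result through the SQL planner.

    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        p.apply(
            BigQueryIO.readTableRows()
                .from("my-project:my_dataset.hacker_news") // placeholder table spec
                // DIRECT_READ uses the BigQuery Storage API, which is what makes
                // projection and filter push-down possible.
                .withMethod(BigQueryIO.TypedRead.Method.DIRECT_READ)
                // Only the fields the query needs are requested...
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                // ...and the filter is pushed to BigQuery as a row restriction.
                .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }
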
    Aug 29, 2020 6:45:13 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 29, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 29, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-X_A485kr6LlL5R-iGeHxB-uwfzyEJAeBxT9fntudA4o.jar
    Aug 29, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-SCqwn1R13zM_s7p4XJQmPTrHNC7uIzulJPl8dhpk6m0.jar
    Aug 29, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-8ImuW1fOtKdRuAEY0fLgz9dtsszMDBHrt8-QL7SCFqI.jar
    Aug 29, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-MvY6r5B4K2G-ZVYFoWUzmHpxJx-7wn2_WR1yP-pvMKE.jar
    Aug 29, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-5oxeTOMcaVzRpNkvtC-z8Wc0axAv7W7D-xZPm9ZtNjs.jar
    Aug 29, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-z0n38ROm1sQ3iDLc8rlJ6ZJzLQUTBxiVXTcATPRyM18.jar
    Aug 29, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-mrKS2RWVWWFESkOQ6mVzcg3DD2aV8idJYT729TE5YdI.jar
    Aug 29, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-y7NgXd_UgpwBJBu7yavOjAaqxlVJsLAyiYMwqy3PHNk.jar
    Aug 29, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-u0avnZC2lXDIDm-xlXIXgXJIYcPesl5A1ETxh2oiRxY.jar
    Aug 29, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-i6W8bvvUdMqjShp9100efFzvez9LRYsuotv2jWbfQME.jar
    Aug 29, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8342010747658198423.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-93ir8ZWssDCxQV7Zgi2OppzDoQPXyWm4wF-w1Ot2p88.jar
    Aug 29, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-UWzAQdVPSJRXOUNOcmz5up6Ad0I2eJEmERlTZZ1cO7s.jar
    Aug 29, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-GdUPUHgPyTKkj1JY97cXqrWEGXbkUBlPsbL1nSClaT0.jar
    Aug 29, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-Qsctqnk-r5kb1My8sDOq_JLMfVDjfLV6lJ7aHGL24D8.jar
    Aug 29, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-rgM84Sr0RdM-Oq_F_81FDNUsJXPBuUdKEsO2ZZSNQFE.jar
    Aug 29, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-eZx_jxIn4uId72UHsvf4GC0mOi6zxwZl9m6I94nCyQs.jar
    Aug 29, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-Nrj1wnf4_Kl4TTWjiAn2DhLufuP2sdxP68bHu1BNrpM.jar
    Aug 29, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-67WZ0w7XXP3esNNhcM2hUcB0AIDLS74b3yHm2_vjdmY.jar
    Aug 29, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-S36bIXorQPnMwNfWxJKcn3MW0OYilEiwRQUKxO0guok.jar
    Aug 29, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-GtDwncEZ-Xs-ttOdUh5-kp0-4T1xc3B2D1-u_w-RV4I.jar
    Aug 29, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-iuxfRkOJRF61uuC18KHMC-73_GSmDcwXu2_LH5SdJOk.jar
    Aug 29, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-X_A485kr6LlL5R-iGeHxB-uwfzyEJAeBxT9fntudA4o.jar
    Aug 29, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-eEZiDLYuglHHKQnbr_Puwcx4EZmoNM7EqW3LRnKLbKI.jar
    Aug 29, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-RNexkoDgIT4XobHKsjAnF26LizYPcpwgN59-FYrTju8.jar
    Aug 29, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-uTSndgBbABEOO1MDjJ_OXPGRysoue95dZtqsEejQofU.jar
    Aug 29, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-qvSyUGpOzcpzqoP7ewaBZpkr7HVHlZ5scfVKNf4ek7w.jar
    Aug 29, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-Zb0--nc40pfB_HG1T365Ir6f4Dv9ohTjxNURMXAb-s8.jar
    Aug 29, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-4iopvXP-KZ6b-AkfMn85DvSPWoiggwXaUHFL050F1c0.jar
    Aug 29, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-TQTCq1tPNxf2mH-Z5kE5qsbfO_sIbUIUCngnUdVdu0A.jar
    Aug 29, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-zWL2wT5JlSXLmIBtUfS3oTDFVlRukvEldal9HQKpGFE.jar
    Aug 29, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-NWPg-ATS3b2qsxhiiqts0u2__yTRtr5sFDv8dySOcRc.jar
    Aug 29, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 1 seconds
    Aug 29, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 29, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 29, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 29, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 29, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 29, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92060 bytes, hash 5f83daa7570ca52940ce7765d28aeaa75a66d0b1f8d196e561448828f0257f2d> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-X4Pap1cMpSlAzndl0orqp1pm0LH40ZblYUSIKPAlfy0.pb
    Aug 29, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Aug 29, 2020 6:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-28_23_45_18-15151293195348860736?project=apache-beam-testing
    Aug 29, 2020 6:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-28_23_45_18-15151293195348860736
    Aug 29, 2020 6:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-28_23_45_18-15151293195348860736
    Aug 29, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-29T06:45:18.084Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 29, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-29T06:45:24.891Z: Worker configuration: n1-standard-1 in us-central1-a.
    Aug 29, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-29T06:45:25.816Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 29, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-29T06:45:25.867Z: Expanding GroupByKey operations into optimizable parts.
    Aug 29, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-29T06:45:25.907Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 29, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-29T06:45:25.985Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 29, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-29T06:45:26.014Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 29, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-29T06:45:26.036Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 29, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-29T06:45:26.074Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 29, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-29T06:45:26.487Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 29, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-29T06:45:26.579Z: Starting 5 workers in us-central1-a...
    Aug 29, 2020 6:45:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-29T06:45:41.435Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 29, 2020 6:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-29T06:45:52.428Z: Autoscaling: Raised the number of workers to 3 based on the rate of progress in the currently running stage(s).
    Aug 29, 2020 6:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-29T06:45:52.463Z: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
    Aug 29, 2020 6:46:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-29T06:46:21.671Z: Workers have started successfully.
    Aug 29, 2020 6:46:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-29T06:46:21.710Z: Workers have started successfully.
    Aug 29, 2020 6:46:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-29T06:46:29.537Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Aug 29, 2020 6:46:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-29T06:46:29.577Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Aug 29, 2020 6:46:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-29T06:46:40.273Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 29, 2020 6:46:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-29T06:46:56.816Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 29, 2020 6:46:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-29T06:46:56.958Z: Cleaning up.
    Aug 29, 2020 6:46:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-29T06:46:57.033Z: Stopping worker pool...
    Aug 29, 2020 6:47:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-29T06:47:45.080Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 29, 2020 6:47:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-29T06:47:45.127Z: Worker pool stopped.
    Aug 29, 2020 6:47:54 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-28_23_45_18-15151293195348860736 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): cfd4f469-1781-444c-8dc0-0960ea01d0a0 and timestamp: 2020-08-29T06:47:54.542000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    15.227

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 29, 2020 6:47:55 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.019 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 2,5,main]) completed. Took 2 mins 51.389 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 39s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/7bu5gwkvxzxtc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #928

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/928/display/redirect?page=changes>

Changes:

[filiperegadas] [BEAM-10649] Add BigQuery Avro logical type support on read

[noreply] Linked SchemaIOProvider development guide in Javadoc (#12720)

[noreply] [BEAM-9680] Add Aggregation Sum lesson to Go SDK katas (#12694)

[Kyle Weaver] [BEAM-10825] Account for multiple Jenkins websites nodes.

[noreply] [BEAM-10831] Fix broken Beam Dependency Check Report (#12716)

[noreply] [BEAM-10812] fix(beam/sdks/go): fix buffer limit on textio (#12683)

[noreply] [BEAM-9680] Add Aggregation Mean lesson to Go SDK kata (#12725)


------------------------------------------
[...truncated 294.31 KB...]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 29, 2020 12:45:28 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 29, 2020 12:45:28 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 29, 2020 12:45:28 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 29, 2020 12:45:28 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 29, 2020 12:45:28 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 29, 2020 12:45:28 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 29, 2020 12:45:28 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 29, 2020 12:45:28 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 29, 2020 12:45:28 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 29, 2020 12:45:29 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 29, 2020 12:45:29 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 29, 2020 12:45:29 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 29, 2020 12:45:29 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 29, 2020 12:45:29 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 29, 2020 12:45:29 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 29, 2020 12:45:29 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 29, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 29, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-hVFPoZdbAU1cgp8DMz0GZmq0cZENmv7vhZjlo9NV3SQ.jar
    Aug 29, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-gYhgV9LJSc2MQokNKG-OYwUCDQcOFMi2-s9Eww4k6ek.jar
    Aug 29, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-6E8JGeukKP7AgDHa4_cp-P-MFWz6L0Bg8t4K2gnzDxc.jar
    Aug 29, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-FWGW4X42-xZk7iMerWzh3NgoQNxifEvPzDdTlIsfMR0.jar
    Aug 29, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-T3f0SNBC-IF5kTNmUgZ88ExGXL07PW0bGk3_zPzEOJ0.jar
    Aug 29, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-2aEwkXk6LWLURlIv8K4ZGdo8us1JkjFl63dDMg100AA.jar
    Aug 29, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-t9uZVPoEgUXPXQK6ZHq-HGGp0YWcYWINvvX9b9iQ-XY.jar
    Aug 29, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-0nflVJwKmZyvIFzAtSNT8yxX-duTjKo7Yi2CXjicl40.jar
    Aug 29, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-f2xB84vUdR43pKs1WwpslOh1aMiDRhCv5vpcFBEuBLM.jar
    Aug 29, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test1752837842891094410.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-CTgMBUj1D_Y7Hlji8xiWXKbVQs7to5mL7TIw0l2dcCA.jar
    Aug 29, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-KQUMZS_KxXEW6AFQC--0jO0G42k0zDpIDrFv3hiTg6M.jar
    Aug 29, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-xZQb62YcX42adkkVMd4a2oyTdtqmnrVlh31hCfkNzB4.jar
    Aug 29, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-t9tCuFx7GH24FYqeocD0JMgIaZdU8eeDYcoti0lmbIk.jar
    Aug 29, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-WScXBWu7_bLKWlZJPu_a7BcMAZqgibD0Ou28-Z1b1Lc.jar
    Aug 29, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-tHAm6UYCoVkp361Sk95Yw-HB4fE_INT28FLd9gU4kVk.jar
    Aug 29, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-uu6gsAlFhN4nK-viAgbjjgUwtJLrhje5JhxGWjElogI.jar
    Aug 29, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-lDnE3jvh_d65up7V_P86TjyWd0f0Y89b8lbYAFIIzAY.jar
    Aug 29, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-waRqeeVuH6rDkNYRDsvAnO2HG0_3LI9tx-vKgUR_5Zs.jar
    Aug 29, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-yl3pWeesG4VYpX2QkZ3JwHSabJHzreKSTBDla_S1fiE.jar
    Aug 29, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-sAL1M_vYE1C2XLIebXaNAcBPrrkPG-0PblB2-OtZM7Q.jar
    Aug 29, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-mF1DezCHuPJp4XFuV_UhZLknQGFH6Y1oPOGWpJB_hQs.jar
    Aug 29, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-WVu5gKn9nlJbFJZa8ClEdApZpD05M41023sN9bVZec4.jar
    Aug 29, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-uYgmRbukh1SE6wLK4Y9QibjtrHCjIobxJgdfbrsmvEs.jar
    Aug 29, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-lmC9LBteBZj-spaiGacH5oIoWaVRCgAQ1ZdxDOPFHA4.jar
    Aug 29, 2020 12:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-04GzaSr4VrVoM-ifCJCccVHI_phK6LAdKk61h_bo5Mk.jar
    Aug 29, 2020 12:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-bEoHAIxWg5davP0JapB0AI1kWWykWbCADnRbM0x1mdM.jar
    Aug 29, 2020 12:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-hVFPoZdbAU1cgp8DMz0GZmq0cZENmv7vhZjlo9NV3SQ.jar
    Aug 29, 2020 12:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-54zFktHsHEe2gfCsGanyPM7iNqzstB2gCXE9ho43LIQ.jar
    Aug 29, 2020 12:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-T7u96fXBy_1Ifzz_MKhQvkKlbP9AzjiyqARRo5xBGUM.jar
    Aug 29, 2020 12:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-68JEuMQRmOsklnJWHKczTnByTsJ_mLNpCuqo-6f9Pls.jar
    Aug 29, 2020 12:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-J5oA-xJWnMal8jJGVZb1ElCXE7hWdiLaUpf52fmhn_w.jar
    Aug 29, 2020 12:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 1 seconds
    Aug 29, 2020 12:45:32 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 29, 2020 12:45:33 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 29, 2020 12:45:33 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 29, 2020 12:45:33 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 29, 2020 12:45:33 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 29, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92061 bytes, hash ce6a74fa1882744067dfaa65aef99ec4280fda5d7a3b1f83c054760cd9cd250b> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-zmp0-hiCdEBn36plrvmexCgP2l16Ox-DwFR2DNnNJQs.pb
    Aug 29, 2020 12:45:33 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Aug 29, 2020 12:45:34 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-28_17_45_33-18114539774706711483?project=apache-beam-testing
    Aug 29, 2020 12:45:34 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-28_17_45_33-18114539774706711483
    Aug 29, 2020 12:45:34 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-28_17_45_33-18114539774706711483
    Aug 29, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-29T00:45:33.409Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
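
The warning above is a configuration detail of this perf job: with autoscalingAlgorithm=NONE the Dataflow service keeps a fixed-size worker pool, so the requested maximum has no effect. A minimal sketch of how such a fixed pool is typically configured on the Java SDK (the option values here are illustrative, not the job's actual settings):

    import org.apache.beam.runners.dataflow.DataflowRunner;
    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions.AutoscalingAlgorithmType;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    DataflowPipelineOptions options =
        PipelineOptionsFactory.fromArgs(args).as(DataflowPipelineOptions.class);
    options.setRunner(DataflowRunner.class);
    // Disable autoscaling: the pool stays at numWorkers for the whole job...
    options.setAutoscalingAlgorithm(AutoscalingAlgorithmType.NONE);
    options.setNumWorkers(5);
    // ...so maxNumWorkers is ignored, which is exactly what the warning reports.
    options.setMaxNumWorkers(5);
    Pipeline pipeline = Pipeline.create(options);
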
    Aug 29, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-29T00:45:41.173Z: Worker configuration: n1-standard-1 in us-central1-a.
    Aug 29, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-29T00:45:41.819Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 29, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-29T00:45:41.860Z: Expanding GroupByKey operations into optimizable parts.
    Aug 29, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-29T00:45:41.893Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 29, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-29T00:45:41.955Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 29, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-29T00:45:41.994Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 29, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-29T00:45:42.030Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 29, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-29T00:45:42.064Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 29, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-29T00:45:42.379Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 29, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-29T00:45:42.454Z: Starting 5 workers in us-central1-a...
    Aug 29, 2020 12:45:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-29T00:45:50.049Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 29, 2020 12:46:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-29T00:46:14.194Z: Autoscaling: Raised the number of workers to 3 based on the rate of progress in the currently running stage(s).
    Aug 29, 2020 12:46:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-29T00:46:14.225Z: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
    Aug 29, 2020 12:46:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-29T00:46:19.553Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 29, 2020 12:46:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-29T00:46:38.317Z: Workers have started successfully.
    Aug 29, 2020 12:46:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-29T00:46:38.352Z: Workers have started successfully.
    Aug 29, 2020 12:47:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-29T00:47:12.189Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 29, 2020 12:47:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-29T00:47:12.351Z: Cleaning up.
    Aug 29, 2020 12:47:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-29T00:47:12.431Z: Stopping worker pool...
    Aug 29, 2020 12:48:07 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-29T00:48:07.070Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 29, 2020 12:48:07 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-29T00:48:07.123Z: Worker pool stopped.
    Aug 29, 2020 12:48:15 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-28_17_45_33-18114539774706711483 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): d637b84b-95cf-451a-82a4-bcc8b4e207d0 and timestamp: 2020-08-29T00:48:15.735000000Z:
                     Metric:                    Value:
                   read_time                    13.777
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 29, 2020 12:48:16 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.022 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 2 mins 55.247 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 59s
106 actionable tasks: 73 executed, 33 from cache

Publishing build scan...
https://gradle.com/s/v3nk7ezdfxl7k

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #927

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/927/display/redirect?page=changes>

Changes:

[noreply] [BEAM-10409] Add combiner packing to graph optimizer phases (#12185)


------------------------------------------
[...truncated 294.89 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 28, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 28, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 28, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 28, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 28, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 28, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 28, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
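
Both failures in this run are the coder error shown above: the PCollection of Beam Row elements emitted by ParDo(RowMonitor) never gets a schema, so no default coder can be inferred. A minimal sketch of the remedy the message itself suggests (the schema fields and the name rows are illustrative placeholders, not the test's actual code):

    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    // Schema matching the projected columns in the query; field types are assumed.
    Schema rowSchema = Schema.builder()
        .addStringField("author")
        .addStringField("type")
        .addStringField("title")
        .addInt64Field("score")
        .build();

    // rows stands for the PCollection<Row> produced by the RowMonitor ParDo.
    rows.setRowSchema(rowSchema);
    // Equivalent alternative named in the error text:
    // rows.setCoder(RowCoder.of(rowSchema));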

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 28, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 28, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 28, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 28, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 28, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 28, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 28, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 28, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
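
The plan above shows both project and predicate push-down: only the four used fields are read, and the filter is evaluated by BigQuery through the Storage Read API. For reference, the same effect can be requested directly on BigQueryIO outside of Beam SQL; a rough sketch, where only the field names and the filter come from this log and the table spec is a hypothetical placeholder:

    import com.google.api.services.bigquery.model.TableRow;
    import java.util.Arrays;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.values.PCollection;

    // pipeline is an existing Pipeline instance.
    PCollection<TableRow> rows = pipeline.apply(
        BigQueryIO.readTableRows()
            .from("some-project:some_dataset.HACKER_NEWS")  // hypothetical table spec
            .withMethod(BigQueryIO.TypedRead.Method.DIRECT_READ)
            // Project push-down: only these columns are read from storage.
            .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
            // Predicate push-down: the row restriction is evaluated server-side.
            .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));
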
    Aug 28, 2020 6:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 28, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 28, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-FZPl8N-bK9w9jQKc2ni8RFNlzkNyc-DL6UDUjCmuhfU.jar
    Aug 28, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-5bLKv2u0CqiF9USaTOpV_8S1HzZfRWdqaPVAJIf82pg.jar
    Aug 28, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-g9AgiwuaWzDwccV-62tSn5jdkFQzVwohkblQc5ShhkI.jar
    Aug 28, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-jpsCJpHArC0p_4qRXscvWTMBVgriAlMvkqAFbI6qkR0.jar
    Aug 28, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-xmLwg2LbHABtWao9pWi_EZuRq6Ibdb6pOq3Xxr59qJ4.jar
    Aug 28, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT--oltyekBPAq21XsoJIJvQKsj-YBd5DZ94otQ31x7qdY.jar
    Aug 28, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-2tKOJjvlRTjGzr1USWAOYY8ITTkg9GRHhBtmqZZZPBc.jar
    Aug 28, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-SIUJgrlN-1PgO3OXVE2s2SLBNDGZsvKwz9IQc-Vpemc.jar
    Aug 28, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-FZPl8N-bK9w9jQKc2ni8RFNlzkNyc-DL6UDUjCmuhfU.jar
    Aug 28, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-JxUCbDe-hGWDA91Odlvlv72Db4fI-spkbfL561OwcWU.jar
    Aug 28, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-nCsuQ6I55z9Eey7SIeOXWKAuA7UbrMnOGgJRnmnbPsA.jar
    Aug 28, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-W9kSoIggWepQhJFMGzeWO5SK6repozGhOQytSx8yem4.jar
    Aug 28, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test1010429458753734118.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-Qhxh_h2BfPe3wKzF8JMnn5L-yxQBK9ee9Q0bnIyLXuU.jar
    Aug 28, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-Fzs0T6dem6Iky9mVMNi-m4BMrYt4Q--iWNQOLaNXBzI.jar
    Aug 28, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-p-zM5he96-tMXi4SMsPdSwGQ7WagramN9hJaxc_puF8.jar
    Aug 28, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-eIDrJP-RIXmzjxK-Nw7bLO5YkCDhPk05kiNm0YKwmnA.jar
    Aug 28, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-Ter7Jq8RvaPTeucWvRVQ530RqXpdFf5LtAB3UFYrCvA.jar
    Aug 28, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-MYC0mIC8Eh4wwjIlSGtnCu5eIHuwsdzEBr1I1r_1k4w.jar
    Aug 28, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-7pT8WMFmpjQwqW_Ywy4Nxbz2x6KkTXDP2DDiZfMJ8No.jar
    Aug 28, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-o-5P1s1dIeMo0nFXMk5Ekgfxaq48KNSlhhIEAHuAC9E.jar
    Aug 28, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-rYX7iBTxDV-VCJPh2sY3eT4gyKDKc7u7LRKaE-iUU_0.jar
    Aug 28, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-0OShHhfLN-5ovvjetUUBlCcF5OTQF6qzREOCSaeGEVg.jar
    Aug 28, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-vz5l5cC4vZa8cpj3cp7ZfAfq870S29vpEcB4TjJGh-I.jar
    Aug 28, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-kP5Qig89VOwEx0gwx8buqXyWwdovXWkk69quEWx6G2s.jar
    Aug 28, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-aCkLl5NbnEvG4lCmzDz6Ouz2gOTzvjig4s83ri0Xiyc.jar
    Aug 28, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-3WwSkZljkU1Bzi4sM2Vcj2rJTc3opUlFchEJMicwltA.jar
    Aug 28, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-ZCuoxJzOBBoENJNx8_1ueT2HNp9idglV051iAaTChTs.jar
    Aug 28, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-bCfHlf53LkEGO0al-gb6SFGtZpW4bpUX3cmhI21lZAk.jar
    Aug 28, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-qqHzeNgwgZxOWBiC_EEbLdvgF4OTK-cEW4TcwPyJvYI.jar
    Aug 28, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-WeHIVsCxsieuTgWtMp33wt9_I9uIl-mjwuqZk173drk.jar
    Aug 28, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-cbOl9pl5a-MtoIcizjxeLiWTrHYgDpZ7ddxAbH58Ms8.jar
    Aug 28, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 1 seconds
    Aug 28, 2020 6:45:23 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 28, 2020 6:45:23 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 28, 2020 6:45:23 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 28, 2020 6:45:23 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 28, 2020 6:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 28, 2020 6:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92063 bytes, hash d813a8ce5a12de040f4b4e2b67cb1aec842804ae9301d146ae784494371c106a> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-2BOozloS3gQPS04rZ8sa7IQoBK6TAdFGrnhElDccEGo.pb
    Aug 28, 2020 6:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Aug 28, 2020 6:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-28_11_45_23-5891980816012111068?project=apache-beam-testing
    Aug 28, 2020 6:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-28_11_45_23-5891980816012111068
    Aug 28, 2020 6:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-28_11_45_23-5891980816012111068
    Aug 28, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-28T18:45:23.545Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 28, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-28T18:45:32.435Z: Worker configuration: n1-standard-1 in us-central1-c.
    Aug 28, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-28T18:45:33.916Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 28, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-28T18:45:33.966Z: Expanding GroupByKey operations into optimizable parts.
    Aug 28, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-28T18:45:34.006Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 28, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-28T18:45:34.111Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 28, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-28T18:45:34.160Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 28, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-28T18:45:34.195Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 28, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-28T18:45:34.245Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 28, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-28T18:45:34.718Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 28, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-28T18:45:34.792Z: Starting 5 workers in us-central1-c...
    Aug 28, 2020 6:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-28T18:45:51.047Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 28, 2020 6:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-28T18:46:02.865Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 28, 2020 6:46:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-28T18:46:39.514Z: Workers have started successfully.
    Aug 28, 2020 6:46:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-28T18:46:39.586Z: Workers have started successfully.
    Aug 28, 2020 6:47:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-28T18:47:25.524Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 28, 2020 6:47:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-28T18:47:25.758Z: Cleaning up.
    Aug 28, 2020 6:47:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-28T18:47:25.997Z: Stopping worker pool...
    Aug 28, 2020 6:48:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-28T18:48:17.705Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 28, 2020 6:48:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-28T18:48:17.746Z: Worker pool stopped.
    Aug 28, 2020 6:48:26 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-28_11_45_23-5891980816012111068 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): e1dc741a-b80e-4ef7-a801-6448d565bcc3 and timestamp: 2020-08-28T18:48:26.897000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    21.364

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 28, 2020 6:48:27 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.019 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker,5,main]) completed. Took 3 mins 18.505 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 11s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/pcw4n2s3ufu46

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #926

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/926/display/redirect?page=changes>

Changes:

[noreply] [BEAM-9898][BEAM-9899] SnowflakeIO.Write for cross-language with python


------------------------------------------
[...truncated 294.16 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 28, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 28, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 28, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 28, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 28, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 28, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 28, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 28, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 28, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 28, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 28, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 28, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 28, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 28, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 28, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 28, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 28, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 28, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-QEluY_1eLzsCgFsNYjJlwpyCufTN-KRGAJf7mqbHYiA.jar
    Aug 28, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-zZU9UdccPoPswDbZ8Ow7UyYPSY-6yF1p5PsrW5YpXjk.jar
    Aug 28, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-a4yqUc7pXz2vOflRTJOfOhChA9UoL1y3OtenqKg3xPc.jar
    Aug 28, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-0X187VxX_DHAh6n_l_Nz5MdnxUcCbxAQY3qIo6JqCfo.jar
    Aug 28, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-1DyPX79i0dcgg81ESYcwt4v8ITjaQMz6Dq8c4cvIxIM.jar
    Aug 28, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-F0UaXIOp02Xrb6-4FAHIacXIQxf9bYiz6KWHc1XXO8Q.jar
    Aug 28, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5830114343664718153.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-QmC7L-9T5NmGngg5Mlv0Pes391MHf9BxNmdp717Q0OM.jar
    Aug 28, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-Qm13v3coPk5M1WPD031ysN2ghVl2DyqbrP-qY6U5rXE.jar
    Aug 28, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-8NdAQCcnbjk0qHAv8BLhRqXKC_HR3z8-5DvPigdoYbk.jar
    Aug 28, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-cVrpLVlUO3_zSiqj0F7PGfkOHcPVSi_FoPrFfNNUGeg.jar
    Aug 28, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-Yes20vOohJQlA-AlbCFmtOvd3lYutcOsNCdkKId8NL4.jar
    Aug 28, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-QEluY_1eLzsCgFsNYjJlwpyCufTN-KRGAJf7mqbHYiA.jar
    Aug 28, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-sFDJilbRVoX1wCgKL4LtdZVBsEJujg-9DJQkgZxD2-c.jar
    Aug 28, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-1oXrNA-Cz7FVhl64_HfUjcu58v9Osrxmu7Ilt_LdyrI.jar
    Aug 28, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-k-4x3-kffErH8mh2XWpdQXkYNWp9r-oC2xVicuQPRxg.jar
    Aug 28, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-4mxHnqwKkyn40B32yW4pf8_2KVXGneMwQHhpcXSclIE.jar
    Aug 28, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-4ztJDcvMJ18GFgxkLWctgN-WvmDxUbv8-7Ac-5vKAv8.jar
    Aug 28, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-UehqLDyQZTLTVl2H2teqlKV-E3IA21XK-2F-1bCZ3JU.jar
    Aug 28, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-87icPFyjOcxj8rs2ven6LCjpwNIeSN8wCoVle_Tqu3A.jar
    Aug 28, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-yj_LIqLp8kP1-oeHJrk0KadMjlHIRENcpyPG-fhIQIE.jar
    Aug 28, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-mDNljBz26f20T7VnkgI9-tP-1sUBau14XM12_497nBA.jar
    Aug 28, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-1o_uYrvWQ9aIWe7eobugtKuXkXJpFi3t2QSzbu4pGFc.jar
    Aug 28, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-7HQZrCMLNTKVtg2bGDXLomHpR6nSKi7Z3wt6qX57LGw.jar
    Aug 28, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-cbeuEbIafYSyKW2vnxf47wha4jhz_NSf0sh14IYIqyw.jar
    Aug 28, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-FGeoR3Jh0JBrtrUA3NzlYJQCKIuvH3BCHlTJb9izaPE.jar
    Aug 28, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-qIejwjC6Zyz8H8YvO1IQI0sT8N25M6n4da5_CWvQDWg.jar
    Aug 28, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-mjtHlP59KrnEwolvDLmYkHnw77LnhvAU4yi1HxPyxxc.jar
    Aug 28, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-1053q0RlPpQ__N0EtWBPlBYNtolTZNyonpleGGWueL0.jar
    Aug 28, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-T9gRZZg9R6DLqtVHIeaMbCikFZsIFs2Zg80DtOw290s.jar
    Aug 28, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-nM6SyxZVa3-lempWOa3NoWBWHR1M0j37YyHMRLOhGQk.jar
    Aug 28, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-RogZ70NVSvQBvmIexVZ3oMu3eZQet7zD7cSHgcQ5f9A.jar
    Aug 28, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 1 seconds
    Aug 28, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 28, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 28, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 28, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 28, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 28, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92060 bytes, hash 55ce9465f0dba289e33681d9117a2857cf9956f0b8a7f361ef175d9285560235> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Vc6UZfDboonjNoHZEXooV8-ZVvC4p_Nh7xddkoVWAjU.pb
    Aug 28, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Aug 28, 2020 12:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-28_05_45_21-1287803059132275658?project=apache-beam-testing
    Aug 28, 2020 12:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-28_05_45_21-1287803059132275658
    Aug 28, 2020 12:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-28_05_45_21-1287803059132275658
    Aug 28, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-28T12:45:21.671Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 28, 2020 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-28T12:45:34.591Z: Worker configuration: n1-standard-1 in us-central1-a.
    Aug 28, 2020 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-28T12:45:35.209Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 28, 2020 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-28T12:45:35.257Z: Expanding GroupByKey operations into optimizable parts.
    Aug 28, 2020 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-28T12:45:35.282Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 28, 2020 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-28T12:45:35.358Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 28, 2020 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-28T12:45:35.393Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 28, 2020 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-28T12:45:35.430Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 28, 2020 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-28T12:45:35.457Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 28, 2020 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-28T12:45:35.912Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 28, 2020 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-28T12:45:35.996Z: Starting 5 workers in us-central1-a...
    Aug 28, 2020 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-28T12:45:49.049Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 28, 2020 12:46:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-28T12:46:08.845Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 28, 2020 12:46:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-28T12:46:30.364Z: Workers have started successfully.
    Aug 28, 2020 12:46:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-28T12:46:30.393Z: Workers have started successfully.
    Aug 28, 2020 12:47:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-28T12:47:01.663Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 28, 2020 12:47:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-28T12:47:01.823Z: Cleaning up.
    Aug 28, 2020 12:47:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-28T12:47:01.914Z: Stopping worker pool...
    Aug 28, 2020 12:47:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-28T12:47:57.415Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 28, 2020 12:47:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-28T12:47:57.452Z: Worker pool stopped.
    Aug 28, 2020 12:48:05 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-28_05_45_21-1287803059132275658 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): c92d8dda-be25-4d45-8fd3-8f7406c83a4d and timestamp: 2020-08-28T12:48:05.219000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    12.159

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 28, 2020 12:48:05 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 2 mins 57.094 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 45s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/iphugeepfhsss

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #925

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/925/display/redirect?page=changes>

Changes:

[ningk] [BEAM-10827] Fix forward the google-cloud-core version

[Valentyn Tymofieiev] [BEAM-8551] Verify that installed python packages don't have conflicting

[Valentyn Tymofieiev] Upgrade TF to 2.3.0.

[noreply] Merge pull request #12492 from [BEAM-6807] Implement an Azure blobstore

[noreply] Merge pull request #12661 from Add export FHIR resources to GCS IO


------------------------------------------
[...truncated 295.18 KB...]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 28, 2020 6:45:29 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 28, 2020 6:45:29 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 28, 2020 6:45:30 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 28, 2020 6:45:30 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 28, 2020 6:45:30 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 28, 2020 6:45:30 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 28, 2020 6:45:30 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
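
    The stack trace above fails exactly where the message says it does: the PCollection<Row> emitted by ParDo(RowMonitor) carries no schema, so no RowCoder can be inferred when the pipeline is finalized. A minimal sketch of the fix the message itself suggests follows; it is a hypothetical stand-alone pipeline, not the integration test's code, and the class name, transform names, schema fields, and sample row are illustrative only. The SDK calls used (Schema.builder, Create.Values.withRowSchema, PCollection.setRowSchema) are the standard schema APIs; setCoder(RowCoder.of(schema)) would be the equivalent lower-level fix.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaFixSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Hypothetical schema mirroring the four columns the test query selects.
        Schema schema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt32Field("score")
                .build();

        PCollection<Row> rows =
            p.apply(
                    "CreateRows",
                    Create.of(
                            Row.withSchema(schema)
                                .addValues("alice", "story", "Example title", 3)
                                .build())
                        .withRowSchema(schema))
                .apply(
                    "RowMonitorLikePassThrough",
                    ParDo.of(
                        new DoFn<Row, Row>() {
                          @ProcessElement
                          public void processElement(@Element Row row, OutputReceiver<Row> out) {
                            out.output(row);
                          }
                        }))
                // Without this line the ParDo output is a PCollection<Row> with no
                // inferable coder, and pipeline construction fails with the
                // IllegalStateException shown above. setCoder(RowCoder.of(schema))
                // has the same effect.
                .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }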

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 28, 2020 6:45:30 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 28, 2020 6:45:30 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 28, 2020 6:45:30 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 28, 2020 6:45:30 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 28, 2020 6:45:30 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 28, 2020 6:45:30 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 28, 2020 6:45:30 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 28, 2020 6:45:30 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 28, 2020 6:45:31 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 28, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 28, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-K8fmrcSTv0l6TRs4vsiHPOzwfCmxFWFXvx9uT041s1U.jar
    Aug 28, 2020 6:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-04Fpmm_HBkpTcLWWjS3yNpDA5oB2qrqIT00pIQUWj_A.jar
    Aug 28, 2020 6:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-_cXRdjsM47PkdiVl9crEcMDfdpZBvbZk3U-a9LhvGys.jar
    Aug 28, 2020 6:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8700301419364015774.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-vm72pp87x9pX1z4Xhrl51A8vGPnGeQDcCZPA3h1T3N8.jar
    Aug 28, 2020 6:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-bXT6daSWEXE94J5Z81Rye7hhC_FGCl1qPOoOdoybLh0.jar
    Aug 28, 2020 6:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-bmWjOYVAwArsDYEy1GiFRdarTSCc7x78Lk08XPhIF-o.jar
    Aug 28, 2020 6:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-c00_2CFrXe9tTVAjGgajMGV_TbgQ8DPzoldRuCYSn0c.jar
    Aug 28, 2020 6:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-vEApbKFryHG5iIhVN7udC3WsDzZjmtT6r6yHlRQiUDA.jar
    Aug 28, 2020 6:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-ii6o3k2a4kFNIMn0qWsId7GplEUkIc3fe5-18Iu6bkM.jar
    Aug 28, 2020 6:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-jVq83VjPNQpyS6Tu-rN9KNFL9cxnr2jHmxbbU1S5h_k.jar
    Aug 28, 2020 6:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-5Zu59VUyBH_G0N_1H0pb1YLj7bnJH1qaEA_R4NKWXfc.jar
    Aug 28, 2020 6:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-4YKaqcq4xUOGvqZVrIjTi932GE3fMQoZ6nqdxPFA_5U.jar
    Aug 28, 2020 6:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-0_9LnYg5dkDQ9S6QBsZblyorY4L58YctkFte4n_g4II.jar
    Aug 28, 2020 6:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-4iNFYxVqJMv7fTtBF0XlzL453w9x7Npl0_oG07FkdeA.jar
    Aug 28, 2020 6:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-Nq3Y33BZRLmm3TgHfQoLAIETyJoTdldL6GdFe7HvNjI.jar
    Aug 28, 2020 6:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-kHIjKvnZffRTfuduPb_D3udVhSnHSU7k3pe8C6BIrf8.jar
    Aug 28, 2020 6:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-AsguEbI9-hUouwBG806h6hs2ErpoAQH8GJcTes6SdDc.jar
    Aug 28, 2020 6:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-K8fmrcSTv0l6TRs4vsiHPOzwfCmxFWFXvx9uT041s1U.jar
    Aug 28, 2020 6:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-UwweiBmYAZfaEHoPUVH9WquEWQV5qeWN9vYXygfn1Nw.jar
    Aug 28, 2020 6:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-_y4XgO3qdd8YScZBI5OYf-hKlkp0nZDSerI6sm9dD8s.jar
    Aug 28, 2020 6:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-94KAsFLzcWiTF9zfMfHnNhEVE7F2sOcm4p-zwbpjVN0.jar
    Aug 28, 2020 6:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-phBWoGkJ6Y2vr8fCaxahKd0bLVirYemtr03jUimQzNY.jar
    Aug 28, 2020 6:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-T5MfDcyYKm4sRnm3AbIOvFeWGdzWp_ef11fDEtlFkvo.jar
    Aug 28, 2020 6:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-YVMLNcoIUYuVjgAKw57qReiHIQK-GI1NH85aCIg7J_E.jar
    Aug 28, 2020 6:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-xXIRv38TBAEfeKFefy8pdu8UsAuylsap-WRlC2PlvLI.jar
    Aug 28, 2020 6:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-TyUxOwFlQT6_-7_34Lz53kpuiLbQGavpR1ZkLnKDGgQ.jar
    Aug 28, 2020 6:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-rtKsNg70sjmwE8oYQjj6jRL1edbwR60CEXs4khj5dyk.jar
    Aug 28, 2020 6:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-Gm0gnad0OVV2NcmEEIsdeMBNBF9iZrx8sLRaEsc4Xz8.jar
    Aug 28, 2020 6:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-q9qLJoc5dojKtR1RReJcr1xS77pPeQmgMk2mv5PcBog.jar
    Aug 28, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-9h7y0Aa9etREFLcyTrjBzy0w-R0ZKc7w9o37LK6QMJ0.jar
    Aug 28, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-3qWn6Cqr_44wb1xGIakc1WpNl5sbXGKZ7vCmoWhy2D8.jar
    Aug 28, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 1 seconds
    Aug 28, 2020 6:45:35 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 28, 2020 6:45:35 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 28, 2020 6:45:35 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 28, 2020 6:45:35 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 28, 2020 6:45:35 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 28, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92063 bytes, hash b496a6c2c0e1cd968f91ad54cbd110b8fbdf1d9f1276e8320464863d487f4294> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-tJamwsDhzZaPka1Uy9EQuPvfHZ8SdugyBGSGPUh_QpQ.pb
    Aug 28, 2020 6:45:36 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Aug 28, 2020 6:45:40 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-27_23_45_36-1840404486793715085?project=apache-beam-testing
    Aug 28, 2020 6:45:40 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-27_23_45_36-1840404486793715085
    Aug 28, 2020 6:45:40 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-27_23_45_36-1840404486793715085
    Aug 28, 2020 6:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-28T06:45:36.333Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 28, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-28T06:45:46.222Z: Worker configuration: n1-standard-1 in us-central1-a.
    Aug 28, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-28T06:45:46.968Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 28, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-28T06:45:47.006Z: Expanding GroupByKey operations into optimizable parts.
    Aug 28, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-28T06:45:47.036Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 28, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-28T06:45:47.120Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 28, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-28T06:45:47.156Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 28, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-28T06:45:47.184Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 28, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-28T06:45:47.214Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 28, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-28T06:45:47.767Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 28, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-28T06:45:47.864Z: Starting 5 workers in us-central1-a...
    Aug 28, 2020 6:46:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-28T06:46:08.948Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 28, 2020 6:46:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-28T06:46:18.732Z: Autoscaling: Raised the number of workers to 2 based on the rate of progress in the currently running stage(s).
    Aug 28, 2020 6:46:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-28T06:46:18.760Z: Resized worker pool to 2, though goal was 5.  This could be a quota issue.
    Aug 28, 2020 6:46:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-28T06:46:24.131Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 28, 2020 6:46:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-28T06:46:42.139Z: Workers have started successfully.
    Aug 28, 2020 6:46:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-28T06:46:42.170Z: Workers have started successfully.
    Aug 28, 2020 6:47:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-28T06:47:16.566Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 28, 2020 6:47:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-28T06:47:16.703Z: Cleaning up.
    Aug 28, 2020 6:47:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-28T06:47:16.794Z: Stopping worker pool...
    Aug 28, 2020 6:48:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-28T06:48:34.554Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 28, 2020 6:48:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-28T06:48:34.613Z: Worker pool stopped.
    Aug 28, 2020 6:48:42 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-27_23_45_36-1840404486793715085 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 1312340b-b092-44ce-9cb1-a01ab67879ad and timestamp: 2020-08-28T06:48:42.846000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    13.377

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 28, 2020 6:48:43 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 3 mins 22.533 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 26s
106 actionable tasks: 73 executed, 33 from cache

Publishing build scan...
https://gradle.com/s/h7mrxabhb7c4m

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #924

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/924/display/redirect?page=changes>

Changes:

[noreply] [BEAM-10699] Logging BigQuery streaming insert tail latencies (#12609)


------------------------------------------
[...truncated 297.73 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 28, 2020 12:46:09 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 28, 2020 12:46:09 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 28, 2020 12:46:09 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 28, 2020 12:46:09 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 28, 2020 12:46:09 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 28, 2020 12:46:09 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 28, 2020 12:46:09 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 28, 2020 12:46:09 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 28, 2020 12:46:09 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 28, 2020 12:46:09 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 28, 2020 12:46:09 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 28, 2020 12:46:09 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 28, 2020 12:46:09 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 28, 2020 12:46:10 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 28, 2020 12:46:10 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 28, 2020 12:46:11 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 28, 2020 12:46:15 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 28, 2020 12:46:15 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-8XjYG79PQ4Kwy96H9U9YREGuAB2L6OyiitPtNW5rN8Y.jar
    Aug 28, 2020 12:46:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-nGqs3qiQcqQNJlJKhConE92XOnnTpMHvWgYdnfuDgAk.jar
    Aug 28, 2020 12:46:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-8XjYG79PQ4Kwy96H9U9YREGuAB2L6OyiitPtNW5rN8Y.jar
    Aug 28, 2020 12:46:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-IyO6WUZGgolbV_dbVJG9dd0X060hZdCnN2EAWai40k8.jar
    Aug 28, 2020 12:46:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-A4fXSpwqoeXyLTKxvWw0SFKL7HSmLfQVoHANwdAVRfo.jar
    Aug 28, 2020 12:46:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT--igjGSwbkFfFNCNFTR3F6JvYDGhT6n6BBCJQLuh3rIU.jar
    Aug 28, 2020 12:46:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-4lmb3ssvizqGg7PYAr3I-l1JkhSzxRFwXMVEVW5szX8.jar
    Aug 28, 2020 12:46:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-1d6oHhpJU1QCuXnd7IMhpqG1Y-QmQElfutfAPTYS6Hw.jar
    Aug 28, 2020 12:46:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-axd-PFuTG7WTNTHoz6-BQU6dhOadlo2KEU99QjL6Dow.jar
    Aug 28, 2020 12:46:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-dqjMRbPseInz922oKveppSClVXRgO3CADd9qtDDtTt0.jar
    Aug 28, 2020 12:46:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-fmwsK4E_V8pKQnROj0s7i_ff8EL3DGP10m3ABpq5kfk.jar
    Aug 28, 2020 12:46:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-uftgjGsk4Pfy1-fjXqhozH-ZV7krnkMVSty3W2qJXTs.jar
    Aug 28, 2020 12:46:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8028864183046024168.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-Eg8EKey5UrvnI9U3om_sEih0LFg_O_xEv-Ox2IOUdvc.jar
    Aug 28, 2020 12:46:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-OCtBh7pze0twfWhuJstFd0kp1cxBNRo2PcygcfXYsBg.jar
    Aug 28, 2020 12:46:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-MyHxRIZN38yTxskkukCiYS_PfDaufOBQvxjodqH9dts.jar
    Aug 28, 2020 12:46:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-JsJy_8eGcqGjb-KTSYFTeoVNxnmPe24mfmqPp6b9okc.jar
    Aug 28, 2020 12:46:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT--VhTkdrJtbiHmA1FRF8HYCRiJBvCbVJi-toyJouQPZg.jar
    Aug 28, 2020 12:46:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-2RcC_xdqSGr4OLbt0BJ27orAX9JQjWT4l82qZtaN51w.jar
    Aug 28, 2020 12:46:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-iS0B_U3H-_Bnnqv8aGntbVSP2KXbzrbufOP1ZSyfZFI.jar
    Aug 28, 2020 12:46:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-1OMKWU_0RphLz3MM-CAd6ZYTPNgNV76Y8s2Eqw0hS0I.jar
    Aug 28, 2020 12:46:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-RCQ8k0tLjhRnNFtS2QRxEy-oNV0lNtNWzzlS6Y1JYBk.jar
    Aug 28, 2020 12:46:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-pc458lMkTDqa9M9wuzw9ispqKwed2c7Wg-BrwNO1yLY.jar
    Aug 28, 2020 12:46:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-E2-ff5T8Y8naF8xnNRC7rhBRbc4oEbGX57pPr_kYaws.jar
    Aug 28, 2020 12:46:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-DdXlSifaEtAMCun6P3J-NnY-H8WfHCoFLfxXpqQUyTY.jar
    Aug 28, 2020 12:46:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-__s7fCiIt6EqTCGCgtaKnnQ0tOZ3vvguIjeCoorX1Vg.jar
    Aug 28, 2020 12:46:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-P6QtZrDFV3mgcGP3mLLMftkaSwOI3LI1zew_Amq3He8.jar
    Aug 28, 2020 12:46:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-MLTUFgi5609fN4Okr6BODPjLTpwo9W-1IrzZDr6UMSc.jar
    Aug 28, 2020 12:46:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-ROQd50DFBJexcnT37kDc0apKauX5vKxQfD1OZzDFm5s.jar
    Aug 28, 2020 12:46:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-0wyPkGtxpaEV5RIrEg-gc24mT2f5tDuH_wJU5ASbTMs.jar
    Aug 28, 2020 12:46:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-sq7_oCRSGrPrwqLyk1We80MCJwAZOc72-J8fx0aKrjA.jar
    Aug 28, 2020 12:46:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-uslGs4ibkNiyRlWPTmDNvTsfNyFgRB1sBg_YapN6kZw.jar
    Aug 28, 2020 12:46:16 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 1 seconds
    Aug 28, 2020 12:46:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 28, 2020 12:46:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 28, 2020 12:46:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 28, 2020 12:46:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 28, 2020 12:46:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 28, 2020 12:46:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92062 bytes, hash f3bdaa35dbd1f7771ebc1f5dadea3703f79162508cd73360f2f1cde89cd51850> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-872qNdvR93cevB9dreo3A_eRYlCM1zNg8vHN6JzVGFA.pb
    Aug 28, 2020 12:46:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Aug 28, 2020 12:46:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-27_17_46_17-16009484370576887028?project=apache-beam-testing
    Aug 28, 2020 12:46:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-27_17_46_17-16009484370576887028
    Aug 28, 2020 12:46:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-27_17_46_17-16009484370576887028
    Aug 28, 2020 12:46:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-28T00:46:17.509Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 28, 2020 12:46:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-28T00:46:25.030Z: Worker configuration: n1-standard-1 in us-central1-a.
    Aug 28, 2020 12:46:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-28T00:46:25.613Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 28, 2020 12:46:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-28T00:46:25.654Z: Expanding GroupByKey operations into optimizable parts.
    Aug 28, 2020 12:46:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-28T00:46:25.693Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 28, 2020 12:46:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-28T00:46:25.777Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 28, 2020 12:46:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-28T00:46:25.814Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 28, 2020 12:46:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-28T00:46:25.848Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 28, 2020 12:46:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-28T00:46:25.886Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 28, 2020 12:46:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-28T00:46:26.321Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 28, 2020 12:46:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-28T00:46:26.411Z: Starting 5 workers in us-central1-a...
    Aug 28, 2020 12:47:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-28T00:46:58.179Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 28, 2020 12:47:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-28T00:47:00.663Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 28, 2020 12:47:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-28T00:47:21.514Z: Workers have started successfully.
    Aug 28, 2020 12:47:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-28T00:47:21.548Z: Workers have started successfully.
    Aug 28, 2020 12:48:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-28T00:48:00.141Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 28, 2020 12:48:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-28T00:48:00.294Z: Cleaning up.
    Aug 28, 2020 12:48:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-28T00:48:00.366Z: Stopping worker pool...
    Aug 28, 2020 12:48:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-28T00:48:55.585Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 28, 2020 12:48:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-28T00:48:55.627Z: Worker pool stopped.
    Aug 28, 2020 12:49:03 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-27_17_46_17-16009484370576887028 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 31d74a76-413b-4fb8-8b9c-ff2864f1c292 and timestamp: 2020-08-28T00:49:03.106000000Z:
                     Metric:                    Value:
                   read_time                    17.285
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 28, 2020 12:49:03 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 3 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 7,5,main]) completed. Took 3 mins 10.471 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
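
For a local reproduction, re-running just the failing suite with the flags Gradle suggests would look roughly like the following (the task path is taken from the log above; the --tests filter targeting BigQueryIOPushDownIT is an assumption, not something the build itself ran):

    ./gradlew :sdks:java:extensions:sql:perf-tests:integrationTest \
        --tests "org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT" \
        --stacktrace --info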

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 46s
106 actionable tasks: 79 executed, 27 from cache

Publishing build scan...
https://gradle.com/s/qldgmqylmryqo

Stopped 2 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #923

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/923/display/redirect?page=changes>

Changes:

[Luke Cwik] [BEAM-10820] Delete dead code SDFFeederViaStateAndTimers.java

[Udi Meiri] [BEAM-10701] Fix paths yet again


------------------------------------------
[...truncated 296.84 KB...]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 27, 2020 6:47:30 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 27, 2020 6:47:30 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 27, 2020 6:47:30 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 27, 2020 6:47:30 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 27, 2020 6:47:30 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 27, 2020 6:47:30 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 27, 2020 6:47:30 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
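
The exception text above points at the missing Row schema on the RowMonitor output and suggests PCollection.setRowSchema. As a minimal, self-contained sketch only (the class name, field names and types below are assumptions inferred from the projected columns, not the integration test's actual code), attaching a schema to a ParDo that emits Row values looks like this:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        // Hypothetical schema matching the projected columns (author, type, title, score).
        Schema schema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt32Field("score")
                .build();

        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        PCollection<Row> rows =
            p.apply(Create.of(
                        Row.withSchema(schema).addValues("someone", "story", "a title", 3).build())
                    .withRowSchema(schema))
                .apply("RowMonitor", ParDo.of(
                    new DoFn<Row, Row>() {
                      @ProcessElement
                      public void process(@Element Row row, OutputReceiver<Row> out) {
                        out.output(row); // pass-through, stands in for the monitoring DoFn
                      }
                    }))
                // Without this call the output PCollection<Row> has no coder and pipeline
                // construction fails with the IllegalStateException shown above.
                .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }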

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 27, 2020 6:47:30 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 27, 2020 6:47:30 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 27, 2020 6:47:30 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 27, 2020 6:47:30 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 27, 2020 6:47:30 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 27, 2020 6:47:30 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 27, 2020 6:47:31 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 27, 2020 6:47:31 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
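
The plan above shows the projection (usedFields) and the filter being handed to the BigQuery source. At the IO level this roughly corresponds to a DIRECT_READ with selected fields and a row restriction; the sketch below is an illustration only (the table reference is hypothetical, and the test itself goes through the SQL table provider rather than calling BigQueryIO directly):

    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class PushDownReadSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        p.apply(
            BigQueryIO.readTableRows()
                .from("my-project:my_dataset.HACKER_NEWS") // hypothetical table reference
                .withMethod(Method.DIRECT_READ)
                // Only the columns the query uses are requested from the Storage API ...
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                // ... and the WHERE clause is pushed down as a row restriction.
                .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }
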
    Aug 27, 2020 6:47:31 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 27, 2020 6:47:33 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 27, 2020 6:47:33 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-vJ9ebWKsJsnheBvueENpwyr2XaJ6iaiGz5wROYDH7iA.jar
    Aug 27, 2020 6:47:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-l4EWNWi5YQrK2CyWuGGS5YLlMsju891yIVauARBHbbc.jar
    Aug 27, 2020 6:47:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-AxTSq50alPmkjD6j9_Co2JIXeFwDuTiIlhq6ociH9H0.jar
    Aug 27, 2020 6:47:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-quIUmd9G0YaUI5gYBJXYlPiWfvscVG18335oFt5DeFc.jar
    Aug 27, 2020 6:47:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-iNmhEZkJ6iTcDokCFwhKUVwJhP1lgI-brgz1QhpIOuY.jar
    Aug 27, 2020 6:47:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-shZ5Eq9tsOOn5sXz7QNNYO8Da3t6ltUAe3OsLeD49Io.jar
    Aug 27, 2020 6:47:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-dLpV7eLC75s-1STKOFRGJ5ZbWsus9fJNtIzVFLQAWAc.jar
    Aug 27, 2020 6:47:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-vJ9ebWKsJsnheBvueENpwyr2XaJ6iaiGz5wROYDH7iA.jar
    Aug 27, 2020 6:47:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-2uBCQlpxz1NBiDQV8ZBblzdwGSbw8MzF133WtJY7q8U.jar
    Aug 27, 2020 6:47:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-e_DAGAq4X41R4bfZ_Dr6sI8uBEU0oVhpvvt4Z8XtI2A.jar
    Aug 27, 2020 6:47:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-xAit_PuP8t3RE2w0mMjqp5Fsrc9qlO3xfo2N8Aht_nc.jar
    Aug 27, 2020 6:47:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-bhT3OLoApbU3OZxeYjtoDaHo3VkaiOA60iqcFlZFQCk.jar
    Aug 27, 2020 6:47:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test199148227696144760.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-1MhBk0oNVZAX0nFzA81hbCIbR3HoI1u8IApLgP3E8DU.jar
    Aug 27, 2020 6:47:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-DPfo2yCBPGoqfIrNgjrs7GfdVG9aVng6uKmNDnhmVo0.jar
    Aug 27, 2020 6:47:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-8fkSji8MNT4511EupzfrtbPFgMADLA-P2jxOOuTtmkg.jar
    Aug 27, 2020 6:47:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-EFBaHf6PflGx0MqO_hUY86B_L_RpcVQE2-hu7d1CO9I.jar
    Aug 27, 2020 6:47:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-GqD1EqVsiShu8SFsWkbZjA4AVxXaOBW4Fxt5leAXKKU.jar
    Aug 27, 2020 6:47:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-IdXTGyYP7Rq0rWH5LMnkT-1GMVCGImHF5lWjJDPEppA.jar
    Aug 27, 2020 6:47:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-TOW_JK6wVE-8JmxgpiRqRORdPN1rgaAIPnuHar6dzUs.jar
    Aug 27, 2020 6:47:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-LZlF9ygFWI2P8HPGadTuUI7k5o64LSsPWtkWwnvgfL4.jar
    Aug 27, 2020 6:47:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-HsqOcIC4q4SccST7u--Z6W7o4LyoDP1rntI9oqSbecI.jar
    Aug 27, 2020 6:47:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-pK8ylgbC6Tyaq9Z352lslS6cieb1GiVHegVGudD9Lbk.jar
    Aug 27, 2020 6:47:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-Zy3dLPwDKDdgbae-Jv8hMeicZeaCj3zmYX5PWCAR9dY.jar
    Aug 27, 2020 6:47:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-NYcYP_vPCx2abVLyKmqJui5GUbn6FtOVDWnwJNtN-08.jar
    Aug 27, 2020 6:47:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-PZKSDer-dJmUtDgPgUOpolbGbffPH_gqzeGBB3sCLUY.jar
    Aug 27, 2020 6:47:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-wTt7xmeRYTleqDumRzqNMqEDqi4xb1Cyhn1WgIEeNQE.jar
    Aug 27, 2020 6:47:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-KhLSDHSLVEyEexAw0t5kSq7XldVUTVgF1IRfpKmXJv4.jar
    Aug 27, 2020 6:47:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-DubM4o3gsRxutYqKWm5FeIOgjIjZgAJh1-85pImqIDM.jar
    Aug 27, 2020 6:47:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-pQekmM4jrse5cupSGZ8WozFUnbaqYrTASAE6o8P-I30.jar
    Aug 27, 2020 6:47:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-Ncu1FCo1MhOMrlNHXl1_JaeHQnL2IKmzMu_vDNCu0OM.jar
    Aug 27, 2020 6:47:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-gi34lk6249SPysDdrQbwprE20yXB8xx67TlTuciDvkY.jar
    Aug 27, 2020 6:47:34 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 1 seconds
    Aug 27, 2020 6:47:34 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 27, 2020 6:47:34 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 27, 2020 6:47:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 27, 2020 6:47:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 27, 2020 6:47:35 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 27, 2020 6:47:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92063 bytes, hash 50b4457fc00dfe392ac9bacce9249772f06116396869ea57424545448fc3c00b> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-ULRFf8AN_jkqybrM6SSXcvBhFjloaepXQkVFRI_DwAs.pb
    Aug 27, 2020 6:47:35 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Aug 27, 2020 6:47:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-27_11_47_35-11101564419866288284?project=apache-beam-testing
    Aug 27, 2020 6:47:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-27_11_47_35-11101564419866288284
    Aug 27, 2020 6:47:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-27_11_47_35-11101564419866288284
    Aug 27, 2020 6:47:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-27T18:47:35.408Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 27, 2020 6:47:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-27T18:47:53.393Z: Worker configuration: n1-standard-1 in us-central1-c.
    Aug 27, 2020 6:47:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-27T18:47:54.939Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 27, 2020 6:47:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-27T18:47:54.981Z: Expanding GroupByKey operations into optimizable parts.
    Aug 27, 2020 6:47:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-27T18:47:55.007Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 27, 2020 6:47:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-27T18:47:55.132Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 27, 2020 6:47:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-27T18:47:55.169Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 27, 2020 6:47:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-27T18:47:55.203Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 27, 2020 6:47:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-27T18:47:55.238Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 27, 2020 6:47:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-27T18:47:55.692Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 27, 2020 6:47:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-27T18:47:55.774Z: Starting 5 workers in us-central1-c...
    Aug 27, 2020 6:48:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-27T18:48:07.409Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 27, 2020 6:48:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-27T18:48:22.477Z: Autoscaling: Raised the number of workers to 3 based on the rate of progress in the currently running stage(s).
    Aug 27, 2020 6:48:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-27T18:48:22.515Z: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
    Aug 27, 2020 6:48:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-27T18:48:28.092Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 27, 2020 6:48:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-27T18:48:53.253Z: Workers have started successfully.
    Aug 27, 2020 6:48:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-27T18:48:53.344Z: Workers have started successfully.
    Aug 27, 2020 6:49:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-27T18:49:38.073Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 27, 2020 6:49:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-27T18:49:38.285Z: Cleaning up.
    Aug 27, 2020 6:49:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-27T18:49:38.383Z: Stopping worker pool...
    Aug 27, 2020 6:50:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-27T18:50:19.276Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 27, 2020 6:50:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-27T18:50:19.345Z: Worker pool stopped.
    Aug 27, 2020 6:50:28 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-27_11_47_35-11101564419866288284 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 51529d5f-2f30-4345-85c9-3c81de4a7573 and timestamp: 2020-08-27T18:50:29.023000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    21.546

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 27, 2020 6:50:29 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.016 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 3 mins 7.616 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 25s
106 actionable tasks: 74 executed, 32 from cache

Publishing build scan...
https://gradle.com/s/4mibcfyk72tri

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #922

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/922/display/redirect?page=changes>

Changes:

[Kamil Wasilewski] [BEAM-10675] Clean up Python Load Tests


------------------------------------------
[...truncated 293.40 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 27, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 27, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 27, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 27, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 27, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 27, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 27, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 27, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 27, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 27, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 27, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 27, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 27, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 27, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 27, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
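
The plan and filter lines above are where the push-down actually lands: the planner has narrowed the source to the four used fields and handed the supported predicate to BigQuery, so the storage read itself projects and filters. As a rough sketch only (not the test's code), the equivalent of that pushed-down read expressed directly against BigQueryIO would look like the following; the table reference is illustrative.

    import java.util.Arrays;
    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.values.PCollection;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Only the four referenced columns are requested, and the supported part of
        // the WHERE clause is evaluated by the BigQuery Storage API before any rows
        // reach the pipeline.
        PCollection<TableRow> rows =
            p.apply(
                "Read Input BQ Rows with push-down",
                BigQueryIO.readTableRows()
                    .from("bigquery-public-data:hacker_news.full") // illustrative table
                    .withMethod(Method.DIRECT_READ)
                    .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                    .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }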
    Aug 27, 2020 12:45:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 27, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 27, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-DlHW6VxMPkfhRI29QGP5FwoRGxH7K51978CI-FP2Ra8.jar
    Aug 27, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-pvoW6h-vudLxdOrtHLlzrW7HjxquMm5vld6H1i1HUR0.jar
    Aug 27, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-toPkRhZNQ-LjdugJuRE9w20rzOoyXI9Ek33mztKum5o.jar
    Aug 27, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-GeyVEPdDgnMV56Sh98eGqtN2QD96cSAlc7ozXM4S-uY.jar
    Aug 27, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-XXNabNgNfBJwcthmeFHLb2lS_1_C5HioQRfwB6CRxkU.jar
    Aug 27, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-bq1oLInx-uW9F_bgGbAo4Z6F5AVb_nMZG_7LaZrk67Y.jar
    Aug 27, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-VxNY3tvwxBSsrD10kuIhadLbY6GXGWI3VNNotLQvgSY.jar
    Aug 27, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-DlHW6VxMPkfhRI29QGP5FwoRGxH7K51978CI-FP2Ra8.jar
    Aug 27, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7643848307065758783.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-yHGmhEb-B3snqdpjUOQmi6oTSulyVfQ5agE8vo7_58I.jar
    Aug 27, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-K7YsFkiSV3qO8Of1FUtMjnbGczjgJpnILZ-APg6tLwg.jar
    Aug 27, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-QOOAOO5KlHKy3wyIsQxuOFesZ6lXV3WtCz9wOIGGUqg.jar
    Aug 27, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-SqR9Ujl67qk6_dDeo2gMeFukTFDLmjLjBEGNfknFL5w.jar
    Aug 27, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-Drpl-CNzfcEvihu_3Gyc4H5UkDOOGVRyiHCMYO8ZFsg.jar
    Aug 27, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-5TrslfOLt8Fb1dD0lMJnD7mnusoHl2EHSnmcVcl3UaY.jar
    Aug 27, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-Ett0cSeggnAK36BupNGRIVkgnX-De536OOsz1UxXufk.jar
    Aug 27, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-tu1zubZ5UiiDpGr1IXIVs7hMXtvk-kdvKFGfj5vv-pY.jar
    Aug 27, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-bQJOzdJyR1YFK_akFD7_cnBYV6fJqSbFY4CSi7G5mkM.jar
    Aug 27, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-VXd4mxv0nvHn29iMT89Ani3PfD9RbXnGKviBlBQFXts.jar
    Aug 27, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-hhfJWjDf5GOSiDI1mm1FewHNhRY41tkZ67SBHkS98-E.jar
    Aug 27, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-aoQzyCsTysPMUYJQoLPH4noKO5hGcjQvVrakfPDwEAs.jar
    Aug 27, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-2FO1rq4zZJ1neS0uqgf4oVrkJvJ2Qg2UZdN87CJwpgA.jar
    Aug 27, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-bI5LVNd0ZDA2AADYYOf-5ZxcxEikpmDLqiPRE_j-Z94.jar
    Aug 27, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-F2-jTDlQppWIfDdI9ifgpxsKhZltacQwkIakII98c-c.jar
    Aug 27, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-U48iIxTnHsz5pNYxWQj_JnKh1WB1CfdNOq0KeuU15BY.jar
    Aug 27, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-0OyCwtPdaONX9xoiErKjhujp9xmuhBit_70nPVciPY0.jar
    Aug 27, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-Eq3mqSuNJOkezq0XGNsKGwPQa_4WSlAwol3s77H2rP4.jar
    Aug 27, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-nV1jrPD3ftseDRDyHGYeUpEiSYQcrdezIn0yhV2TDeo.jar
    Aug 27, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-5M049fxI5WkZHPQ3iOGch8a5iipE7cUUur1lq2_6aNs.jar
    Aug 27, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-os34oJMDTIqOgltPBGUhSVWYOETt7kyiv9Q1Kst3XnA.jar
    Aug 27, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-UE8FcO0UauupWhXWg83fk0VunD1A2qzFQ7gSMCvpHoE.jar
    Aug 27, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-OghHr5yQbgif9gkK6-yLmKfD0gc6-1i_QM-5CFBW2VE.jar
    Aug 27, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 1 seconds
    Aug 27, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 27, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 27, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 27, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 27, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 27, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92060 bytes, hash c44fcba2c4e92c48a7bcb144f5f6aaa9b24289c87586d6d9e14c13c09d8a803e> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-xE_LosTpLEinvLFE9faqqbJCich1htbZ4UwTwJ2KgD4.pb
    Aug 27, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Aug 27, 2020 12:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-27_05_45_18-17355357856432410541?project=apache-beam-testing
    Aug 27, 2020 12:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-27_05_45_18-17355357856432410541
    Aug 27, 2020 12:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-27_05_45_18-17355357856432410541
    Aug 27, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-27T12:45:19.015Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 27, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-27T12:45:26.771Z: Worker configuration: n1-standard-1 in us-central1-f.
    Aug 27, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-27T12:45:28.144Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 27, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-27T12:45:28.188Z: Expanding GroupByKey operations into optimizable parts.
    Aug 27, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-27T12:45:28.226Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 27, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-27T12:45:28.307Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 27, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-27T12:45:28.340Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 27, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-27T12:45:28.375Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 27, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-27T12:45:28.409Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 27, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-27T12:45:28.815Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 27, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-27T12:45:28.879Z: Starting 5 workers in us-central1-f...
    Aug 27, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-27T12:45:34.702Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
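
The warning above concerns Dataflow custom metrics: every distinct user-defined metric name in a job maps to one Stackdriver metric descriptor, and once the project hits the 100-descriptor cap no new descriptors are created. For reference, a hedged sketch of how such user-defined metrics are typically declared with the Beam Metrics API; MonitoredFn and the metric names here are hypothetical, not this test's own monitors.

    import org.apache.beam.sdk.metrics.Counter;
    import org.apache.beam.sdk.metrics.Distribution;
    import org.apache.beam.sdk.metrics.Metrics;
    import org.apache.beam.sdk.transforms.DoFn;

    // Each distinct namespace/name pair becomes one Dataflow custom metric (and
    // one Stackdriver metric descriptor), regardless of which DoFn declares it,
    // so reusing a small, stable set of names keeps the descriptor count down.
    class MonitoredFn extends DoFn<String, String> {
      private final Counter rowsSeen = Metrics.counter("perf_tests", "rows_seen");
      private final Distribution rowSize = Metrics.distribution("perf_tests", "row_size_bytes");

      @ProcessElement
      public void process(@Element String element, OutputReceiver<String> out) {
        rowsSeen.inc();
        rowSize.update(element.length());
        out.output(element);
      }
    }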
    Aug 27, 2020 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-27T12:45:53.518Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 27, 2020 12:46:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-27T12:46:19.999Z: Workers have started successfully.
    Aug 27, 2020 12:46:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-27T12:46:20.031Z: Workers have started successfully.
    Aug 27, 2020 12:46:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-27T12:46:53.485Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 27, 2020 12:46:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-27T12:46:53.850Z: Cleaning up.
    Aug 27, 2020 12:46:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-27T12:46:53.935Z: Stopping worker pool...
    Aug 27, 2020 12:47:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-27T12:47:45.281Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 27, 2020 12:47:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-27T12:47:45.326Z: Worker pool stopped.
    Aug 27, 2020 12:47:53 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-27_05_45_18-17355357856432410541 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 3783504a-7750-45d5-bb6e-d4951fe63e4d and timestamp: 2020-08-27T12:47:53.633000000Z:
                     Metric:                    Value:
                   read_time                     13.77
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 27, 2020 12:47:54 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.019 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 5,5,main]) completed. Took 2 mins 48.04 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 37s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/jx3q3e66duabw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #921

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/921/display/redirect?page=changes>

Changes:

[zyichi] Fix PairWithRestrictionFn process no attribute signature error

[Boyuan Zhang] Handle split when truncate observes windows.

[noreply] [BEAM-10821] Add Python SDK typing improvements to CHANGES.md (#12693)


------------------------------------------
[...truncated 297.38 KB...]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 27, 2020 6:45:29 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 27, 2020 6:45:29 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 27, 2020 6:45:29 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 27, 2020 6:45:29 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 27, 2020 6:45:29 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 27, 2020 6:45:29 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 27, 2020 6:45:29 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 27, 2020 6:45:29 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 27, 2020 6:45:29 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 27, 2020 6:45:29 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 27, 2020 6:45:29 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 27, 2020 6:45:29 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 27, 2020 6:45:29 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 27, 2020 6:45:29 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 27, 2020 6:45:29 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 27, 2020 6:45:30 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 27, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 27, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-GR0b_GD4cPZvtA4nld63tTYzeBwrWIzZfR3_RQKBECE.jar
    Aug 27, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-piFnSnUMcu1eyaFgiLpwVejxmFwfRvx1-Y130BpQuPc.jar
    Aug 27, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-s3CMU1ZPYn5OznCuHth-UPr_rEvWe125BGbAmOXZnbM.jar
    Aug 27, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-qm2oYgOqGB7pHOByTwXp5eh-m6QQhDYdpe36TuYl158.jar
    Aug 27, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-s1hyXPwOainNyN92Yve2esNDZsW-3noNsypB3HvJ9e4.jar
    Aug 27, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-CfirEnKWpo5WX8Odk-HKgOfky1QT4GNHwsIbTWUr99g.jar
    Aug 27, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-26viMTJQH--zkcphyuXCgh3wJIxUCp_ZEPyhXYb--mE.jar
    Aug 27, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5743961621650067682.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-kbyHstvy-orPBaWVvGsoyZqhhP7YFlqSL88MqBxBUd8.jar
    Aug 27, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-GR0b_GD4cPZvtA4nld63tTYzeBwrWIzZfR3_RQKBECE.jar
    Aug 27, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-4Pq2HDmoDTnUE7ORfyQoQ0lpAiuCTXFBk8Wu_wvpJ58.jar
    Aug 27, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-mGMG4sR4M516UfYOxak-QhsIX07wj4UvpbU6Z0SQ-fA.jar
    Aug 27, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-ddf2Dth3EpljDDRnGE9zM1b6GErck4Wnk4WBYnajeHA.jar
    Aug 27, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-k5GCc-29Mc8CsfLNwY_rFVyqtyfAzq-DDiOdVs2q610.jar
    Aug 27, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-zWJYRqPL2Ibj5o6Sfs8WCSJu39Tkn-hVQCQMJDOmDOY.jar
    Aug 27, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-4ZRu5LasCPjPLFDRAKeRywiVtLgeqYk5Dr53G9CEtY8.jar
    Aug 27, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-Za8w8JGDXf57KAqnslQr_bQnUjzQwWkiNehtPiH5tEs.jar
    Aug 27, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-f2MPLJXVWacDmeOoOQUAqtEXd-BFZSWL4UOulLl3ryc.jar
    Aug 27, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-X2yZ5lhnLYvapJ0CihIOfZHNQ5ay0qxWiKXWsnD1uV8.jar
    Aug 27, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-pnMbz56X504hR555aPRhXb82R3J2WximJ7dGaOTemdg.jar
    Aug 27, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-99DVRtexf24_kv8vHTzbe4kO-I3C47PDdZFXvKrgAP8.jar
    Aug 27, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-dKbkE4B_oWqr6jzPfDgfCxgqontGLRnmfR_HsJEhYr4.jar
    Aug 27, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-O-vhtpVXcPf44MTVWyinUpfPzvkkY_j01tEqYHDFcDA.jar
    Aug 27, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-xOiYvjHLBVxyMyVBRA3RziA9VbalcNVVfXjgrMdP-vk.jar
    Aug 27, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-6182-dcSn_ykek3NgBJJNUcwBq2i4WDePiFhTmWIdBc.jar
    Aug 27, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-LD307R6ZXGVYlGyXKGyNwAX3LUok8vVSsL2Q6CL0lrU.jar
    Aug 27, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-6VVQvRr4kixtEa3ruFY_CYsQNiTECLMTQ6pWzr-8fqw.jar
    Aug 27, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-OauXDj4jr4z5UP1G96uxrNFsp632Y3S9jiw3bp7xcic.jar
    Aug 27, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-zvQvEEABI8qEYjZOefSOxdi9KXoBJJQNDhmI_hzuZXA.jar
    Aug 27, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-AYxnbNsojyq8cudUy5Yv27sSlQTfcwdJG_0UEaD5ESs.jar
    Aug 27, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-JbDvCDg-hbxjAn8w9jGpQFvOPUCbcpTEJqnnXMMq9Pc.jar
    Aug 27, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-vMw_fyap4NyO-GQTYb-Uc67o7n1EVJZklJMr74pAXyQ.jar
    Aug 27, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 1 seconds
    Aug 27, 2020 6:45:33 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 27, 2020 6:45:34 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 27, 2020 6:45:34 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 27, 2020 6:45:34 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 27, 2020 6:45:34 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 27, 2020 6:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92060 bytes, hash bdba5d5519a18ba463a29e9684bc7f5bdc7fca29ca7e4223b7e67d234ecfa68c> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-vbpdVRmhi6Rjop6WhLx_W9x_yinKfkIjt-Z9I07Ppow.pb
    Aug 27, 2020 6:45:34 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Aug 27, 2020 6:45:35 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-26_23_45_34-17204497299830570484?project=apache-beam-testing
    Aug 27, 2020 6:45:35 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-26_23_45_34-17204497299830570484
    Aug 27, 2020 6:45:35 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-26_23_45_34-17204497299830570484
    Aug 27, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-27T06:45:34.439Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 27, 2020 6:45:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-27T06:45:45.294Z: Worker configuration: n1-standard-1 in us-central1-f.
    Aug 27, 2020 6:45:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-27T06:45:46.854Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 27, 2020 6:45:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-27T06:45:46.916Z: Expanding GroupByKey operations into optimizable parts.
    Aug 27, 2020 6:45:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-27T06:45:47.062Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 27, 2020 6:45:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-27T06:45:47.136Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 27, 2020 6:45:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-27T06:45:47.167Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 27, 2020 6:45:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-27T06:45:47.201Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 27, 2020 6:45:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-27T06:45:47.238Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 27, 2020 6:45:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-27T06:45:47.723Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 27, 2020 6:45:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-27T06:45:47.793Z: Starting 5 workers in us-central1-f...
    Aug 27, 2020 6:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-27T06:45:56.688Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 27, 2020 6:46:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-27T06:46:11.135Z: Autoscaling: Raised the number of workers to 3 based on the rate of progress in the currently running stage(s).
    Aug 27, 2020 6:46:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-27T06:46:11.207Z: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
    Aug 27, 2020 6:46:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-27T06:46:16.613Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 27, 2020 6:46:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-27T06:46:33.608Z: Workers have started successfully.
    Aug 27, 2020 6:46:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-27T06:46:33.638Z: Workers have started successfully.
    Aug 27, 2020 6:47:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-27T06:47:10.133Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 27, 2020 6:47:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-27T06:47:10.323Z: Cleaning up.
    Aug 27, 2020 6:47:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-27T06:47:10.417Z: Stopping worker pool...
    Aug 27, 2020 6:48:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-27T06:48:00.752Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 27, 2020 6:48:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-27T06:48:00.896Z: Worker pool stopped.
    Aug 27, 2020 6:48:12 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-26_23_45_34-17204497299830570484 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): cfa52fdc-f140-4bc9-bb05-c9ea2dba13a6 and timestamp: 2020-08-27T06:48:12.209000000Z:
                     Metric:                    Value:
                   read_time                     14.67
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 27, 2020 6:48:12 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 2,5,main]) completed. Took 2 mins 51.143 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 55s
106 actionable tasks: 73 executed, 33 from cache

Publishing build scan...
https://gradle.com/s/f5nw33ajudcx6

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #920

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/920/display/redirect?page=changes>

Changes:

[leiyiz] implement query 12

[saavan] Fix author names

[noreply] [BEAM-10701] Add codecov config, fix paths hopefully (#12684)

[noreply] Merge pull request #12674 from [BEAM-8258] basic metric feature for

[noreply] Add skipped tests for Series.dt and Series.str methods (#12615)

[noreply] [BEAM-9547] Add a few more DataFrame operations (#12682)


------------------------------------------
[...truncated 292.21 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 27, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 27, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 27, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 27, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 27, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 27, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 27, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
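
The IllegalStateException above is Beam's generic missing-Row-coder error: a ParDo that outputs Row elements cannot have its coder inferred, so a schema has to be attached explicitly. As a general illustration of what the message recommends (not a fix for this particular regression, which sits inside the generated Beam SQL pipeline), a minimal standalone sketch with assumed field names and types looks like this:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class SetRowSchemaSketch {
      public static void main(String[] args) {
        Pipeline pipeline = Pipeline.create();

        // Schema mirroring the projected columns of the query in the log (types assumed).
        Schema schema = Schema.builder()
            .addStringField("author")
            .addStringField("type")
            .addStringField("title")
            .addInt64Field("score")
            .build();

        Row row = Row.withSchema(schema).addValues("someone", "story", "a title", 3L).build();

        PCollection<Row> rows = pipeline
            // Create also needs an explicit coder for Row elements.
            .apply(Create.of(row).withCoder(RowCoder.of(schema)))
            .apply("PassThroughRows", ParDo.of(new DoFn<Row, Row>() {
              @ProcessElement
              public void process(@Element Row r, OutputReceiver<Row> out) {
                out.output(r);
              }
            }))
            // Without this call, coder inference fails with exactly the
            // IllegalStateException quoted above.
            .setRowSchema(schema);

        pipeline.run().waitUntilFinish();
      }
    }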

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 27, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 27, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 27, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 27, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 27, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 27, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 27, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 27, 2020 12:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 27, 2020 12:45:14 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 27, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 27, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-5RR2CdmIMT-qi5Tzb6pOGEP8-bvcu3G-RWCBzhxLo60.jar
    Aug 27, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-pzN4yPxiFjTKqJOZd55OZ0EZlHYJmYpJKxsow_mQQOI.jar
    Aug 27, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-hxzXES4J-jVA681dUkEMu6QcbomQ4O-wKZKMcavSitE.jar
    Aug 27, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-5HOvsL1aCf78akKYQOSV6rMRhED8oMYrDM7wwmzXII0.jar
    Aug 27, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT--4_ixrj0tK-fD-Z9vkW3NDJI_ZEOOxf2K_pFNfT49QU.jar
    Aug 27, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-6t2OO3JieC15siSwwBY_Xjlp4TJ_m74vSxEZf_TcqXE.jar
    Aug 27, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-ugQPTmLouMCXHFImaHvobGNr4yLnZpiZIrUPTIdUeaU.jar
    Aug 27, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-FKTPIlMobSMXM7OSR-krLdXiYcaykwDXlDqEtOfGtrE.jar
    Aug 27, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-kMgx9VCs0odJSw1gV06k6f2lV8nHu-q0j9taNJT7F3s.jar
    Aug 27, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-A3UglhMYuCwFinRuu2eCY8836MDQtxDQuJEQk3WyPAc.jar
    Aug 27, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-H2v2oNutN9H7wFEtIwM-89GBxyGnxsAhSTxh4j92vaY.jar
    Aug 27, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-SemeBUbDDaDWTJ13-2suGEksZQ00TPruHarsObvxjsU.jar
    Aug 27, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-J37XZNd1ghZJoTvXanwEoxrHXkBPTNDSLhML6Qvvdgk.jar
    Aug 27, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-5_tkjm9Fh_kkqhJSunfcMV5HnCr6espPtJjMFOywGIY.jar
    Aug 27, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-M8k4Bx_GZlJ7KQGyyi7YndhaAEm70MC16a-OEuvdT_4.jar
    Aug 27, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-5RR2CdmIMT-qi5Tzb6pOGEP8-bvcu3G-RWCBzhxLo60.jar
    Aug 27, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-66-gIbxm1dxyOjIpLnxsE9BXQBY_BJUE3c-57r6uXvk.jar
    Aug 27, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-iHZ6Zw0jDlMCSpOZGG0RQHBELZy3cLW3gU2Hvu2fRsU.jar
    Aug 27, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-udCPAqANVc8yl9-_63ULyniwmb1vp6faYSnwsykhg-0.jar
    Aug 27, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-ZOQ64jLEdtPtm-LuWujLepbFL24OQPRe0uqpX9EXDeo.jar
    Aug 27, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-FYQKxBYUoZxUWpHcjcbJIXWczmnjdd0Dy3kg0sI4GYU.jar
    Aug 27, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-hExNrLalLA-MZazbc7jzEnDUQONKwMMuMa6kl4uK5ZI.jar
    Aug 27, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-xSvLCh7BoA9jZ7qHoP_2rsYDzBn-xTFkKt1rmg7Hl3s.jar
    Aug 27, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-htj3LJnNdcWb1pHFKm-GbWejdWwvEfvK5WJlc9vlptY.jar
    Aug 27, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT--_GI2jVUt65OlgNYWnjbpMC5LUWZrwdYOvG3IKkdhQI.jar
    Aug 27, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-R9XHS91A0Xy5y0cPRbrSipC69OGyOkWGdEfUVnBAX_A.jar
    Aug 27, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-XOsl5YmIukEJp8ex4U9b6V5nOQR9BIrD3GcyPWnTAc8.jar
    Aug 27, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8921028953561469495.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-23V2ja7tGi0P0yH9nBqpQr0M2pXXKb3i5rDiTgLb5Y0.jar
    Aug 27, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-O3SN5IAq0aYVFDz5KpHtBCUxPstQCuh6WEKQQLNunBs.jar
    Aug 27, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-A_vD_vtHbccU28mcJiho1VHQ7GoXTNODy5xvqIjE2C8.jar
    Aug 27, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-sJTfJ-RZzuk2xzTx4VGtUlv8vaUZBZ4ct8nLMKzmOK0.jar
    Aug 27, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 1 seconds
    Aug 27, 2020 12:45:20 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 27, 2020 12:45:20 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 27, 2020 12:45:20 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 27, 2020 12:45:20 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 27, 2020 12:45:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 27, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92060 bytes, hash 32a64ae8de4d97842be90314cfe5e911f2e7bb4d8a26f8e5fb15dfaa0cc5a1eb> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-MqZK6N5Nl4Qr6QMUz-XpEfLnu02KJvjl-xXfqgzFoes.pb
    Aug 27, 2020 12:45:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Aug 27, 2020 12:45:22 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-26_17_45_20-5542606730585426997?project=apache-beam-testing
    Aug 27, 2020 12:45:22 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-26_17_45_20-5542606730585426997
    Aug 27, 2020 12:45:22 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-26_17_45_20-5542606730585426997
    Aug 27, 2020 12:45:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-27T00:45:20.886Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 27, 2020 12:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-27T00:45:28.169Z: Worker configuration: n1-standard-1 in us-central1-a.
    Aug 27, 2020 12:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-27T00:45:28.841Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 27, 2020 12:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-27T00:45:28.884Z: Expanding GroupByKey operations into optimizable parts.
    Aug 27, 2020 12:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-27T00:45:28.910Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 27, 2020 12:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-27T00:45:28.986Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 27, 2020 12:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-27T00:45:29.023Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 27, 2020 12:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-27T00:45:29.057Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 27, 2020 12:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-27T00:45:29.092Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 27, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-27T00:45:29.545Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 27, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-27T00:45:29.629Z: Starting 5 workers in us-central1-a...
    Aug 27, 2020 12:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-27T00:45:52.896Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 27, 2020 12:46:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-27T00:46:00.054Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 27, 2020 12:46:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-27T00:46:21.794Z: Workers have started successfully.
    Aug 27, 2020 12:46:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-27T00:46:21.833Z: Workers have started successfully.
    Aug 27, 2020 12:46:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-27T00:46:56.577Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 27, 2020 12:46:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-27T00:46:56.751Z: Cleaning up.
    Aug 27, 2020 12:46:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-27T00:46:56.879Z: Stopping worker pool...
    Aug 27, 2020 12:47:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-27T00:47:49.885Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 27, 2020 12:47:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-27T00:47:49.929Z: Worker pool stopped.
    Aug 27, 2020 12:47:59 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-26_17_45_20-5542606730585426997 finished with status DONE.
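
The BeamPushDownIOSourceRel plan and the "Pushing down the following filter" line above mean the projection and the WHERE clause were handed to the BigQuery Storage read itself rather than evaluated in a downstream Calc. At the plain BigQueryIO level the pushed-down read corresponds roughly to the following sketch (the table reference is assumed; the field list and row restriction are the ones logged above):

    import java.util.Arrays;
    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
    import org.apache.beam.sdk.values.PCollection;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline pipeline = Pipeline.create();

        // DIRECT_READ uses the BigQuery Storage API, which is what allows the
        // selected fields and the row restriction to be applied inside the source.
        PCollection<TableRow> rows = pipeline.apply(
            BigQueryIO.readTableRows()
                .from("some-project:some_dataset.HACKER_NEWS")  // assumed table reference
                .withMethod(Method.DIRECT_READ)
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                .withRowRestriction("(`type` = 'story' OR `type` = 'job') AND `score` > 2"));

        // Running this for real additionally needs GCP credentials and the usual
        // --project/--tempLocation pipeline options, omitted here.
        pipeline.run().waitUntilFinish();
      }
    }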

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): b28f4b04-4c76-4019-8979-76d431c54dc0 and timestamp: 2020-08-27T00:47:59.420000000Z:
                     Metric:                    Value:
                   read_time                    13.962
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 27, 2020 12:47:59 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 52 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.002 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.006 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 2 mins 57.432 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 42s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/oj7g7rgzhdaus

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #919

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/919/display/redirect?page=changes>

Changes:

[sychen] Refactor to reuse computation+key

[Luke Cwik] [BEAM-10771] Fix whitespace error causing whitespace lint to fail

[noreply] [BEAM-4980] Upgrade to Netty 4.1.51.Final and Netty_tcnative

[noreply] [BEAM-10670] Make Twister2Runner opt-out for using an SDF powered Read


------------------------------------------
[...truncated 294.66 KB...]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 26, 2020 6:45:30 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 26, 2020 6:45:30 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 26, 2020 6:45:30 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 26, 2020 6:45:30 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 26, 2020 6:45:30 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 26, 2020 6:45:30 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 26, 2020 6:45:30 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 26, 2020 6:45:31 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 26, 2020 6:45:31 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 26, 2020 6:45:31 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 26, 2020 6:45:31 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 26, 2020 6:45:31 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 26, 2020 6:45:31 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 26, 2020 6:45:31 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 26, 2020 6:45:31 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 26, 2020 6:45:32 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 26, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 26, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-T9belIrp8lfsYa9KPbe3Xft2J-ImGa5AwlKrgiaV7XE.jar
    Aug 26, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-wkTYTneBmAosIs27-Ye6yAE6OcypYDJcsl_ycULp6Uk.jar
    Aug 26, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-cHwmW3DAxFhoHubQqon9T0DCgMWIbWYtaK91r1VO5Ug.jar
    Aug 26, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT--tRGO7k8PtwaEeE90P5LJQJKV5CNXjWZilq1Lc283Q0.jar
    Aug 26, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-2Bnh6TG8R3s_8yDx_vMoKjBXhAFnygeL3pIB9unUdvk.jar
    Aug 26, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-xaqmjKGh7DMwaKBSj1VZ-RyyRy0A6B2Vd-xOCeD8Mdg.jar
    Aug 26, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-QSZuxROSk-77TV6CiejHcGA0v57B2it3Hr4nrQHafqY.jar
    Aug 26, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-SsqwERGJaFqgF6GVhSXYWluUBN1DmvW40KYRyioBEPQ.jar
    Aug 26, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-rypWTAanf72jWYNUFP7PVy3csz4oA7t2DMTAwshncvM.jar
    Aug 26, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-afBjMVprgrpmJkmRol-t7ulGnfe6UKFM1qVRLTXoa8M.jar
    Aug 26, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-QqyRLP5g0cBvmDnt9RyX0_ptFTBqSROCOCZXYyG70hg.jar
    Aug 26, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-ghGKnnZt6GEdGAVgP-vyEIoEFeL9P7oM8SeUYukNTw0.jar
    Aug 26, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-_AexY0YW6PhrvLOV_VfuA4NrYR3ClthCP9MxovUI7zE.jar
    Aug 26, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-65wksxCAHmOpCi75HYXLmQZOmoYQ9uTuBZdFNrFL4ek.jar
    Aug 26, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-T9belIrp8lfsYa9KPbe3Xft2J-ImGa5AwlKrgiaV7XE.jar
    Aug 26, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-gwqKBH0Pkm805k76x1WdRCD86-5IgWJt-_xOIaAF2sE.jar
    Aug 26, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-GEaeiPUFto4oX_UTY-hftK-nrfjVtkEyDyGvf3VYq6M.jar
    Aug 26, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-lqsmgiBOwgRfpm_eHg4Uk595akKMRh60xXIwGIs3-wk.jar
    Aug 26, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-9K-PdEUjkBKVT1h0FdacXvTlJKmJ0vOsmZtRBCSES7w.jar
    Aug 26, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5557057884791426881.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-vf9Ed6433yYgVcPN6NaqwqLWBTBA21f3FVv6yzfoQn4.jar
    Aug 26, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-4vhbmLm3uQd5bLOc-sGn3nscQENh0hhI-pYf4cWicNE.jar
    Aug 26, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-6ldwB0DX78F9WOGnW4YWxEIkl6hRTVY51dROGMuRGfQ.jar
    Aug 26, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-B6DGK-rzJxWplif5cxhCHGleXwkMHIa1kG7kZRqrkF4.jar
    Aug 26, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-vq3tt_kLnmZtWNgsfG3cRH8VYgQdi_TpeZxGA9UInYg.jar
    Aug 26, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-9K7j_bF61Z1BxJtmvkN6Vo3GlYj6uhFZeltrq7VVXFs.jar
    Aug 26, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-_Aq243ld85zw0B0pLjO1OE15PQsy7I-qS93TZsUri3Y.jar
    Aug 26, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-418vAFpgLkoeNQPM1v85Di-b0igIw1o_3F7EOHLhRHY.jar
    Aug 26, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-qz40320zVBYjj86sbtsR-G6p00-vq_MI0fpVjaim50s.jar
    Aug 26, 2020 6:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-OrRIGEGFrjJcy5a8PXNDbcksA4PbCVCItUJN6WZA-PQ.jar
    Aug 26, 2020 6:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-p1OC_OF-t7Tt2fgRaYy3hGanJ2Mbu6D-bUFu2um_6Ec.jar
    Aug 26, 2020 6:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-twGnSfwhCBjkS9VeJxqsLIFXTivbp7T7x7GD0wKpytU.jar
    Aug 26, 2020 6:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 1 seconds
    Aug 26, 2020 6:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 26, 2020 6:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 26, 2020 6:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 26, 2020 6:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 26, 2020 6:45:35 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 26, 2020 6:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92060 bytes, hash 77d1ae9d1b874e65f8aaa1d7d4cf74fdd2bde789d0c8653131360ad144e0bbe2> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-d9GunRuHTmX4qqHX1M90_dK954nQyGUxMTYK0UTgu-I.pb
    Aug 26, 2020 6:45:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Aug 26, 2020 6:45:37 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-26_11_45_36-3926239607567426902?project=apache-beam-testing
    Aug 26, 2020 6:45:37 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-26_11_45_36-3926239607567426902
    Aug 26, 2020 6:45:37 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-26_11_45_36-3926239607567426902
    Aug 26, 2020 6:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-26T18:45:36.088Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 26, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-26T18:45:43.405Z: Worker configuration: n1-standard-1 in us-central1-f.
    Aug 26, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-26T18:45:44.033Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 26, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-26T18:45:44.076Z: Expanding GroupByKey operations into optimizable parts.
    Aug 26, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-26T18:45:44.103Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 26, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-26T18:45:44.343Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 26, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-26T18:45:44.370Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 26, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-26T18:45:44.404Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 26, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-26T18:45:44.440Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 26, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-26T18:45:44.835Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 26, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-26T18:45:44.913Z: Starting 5 workers in us-central1-f...
    Aug 26, 2020 6:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-26T18:46:04.124Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 26, 2020 6:46:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-26T18:46:16.076Z: Autoscaling: Raised the number of workers to 3 based on the rate of progress in the currently running stage(s).
    Aug 26, 2020 6:46:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-26T18:46:16.112Z: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
    Aug 26, 2020 6:46:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-26T18:46:21.449Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 26, 2020 6:46:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-26T18:46:41.423Z: Workers have started successfully.
    Aug 26, 2020 6:46:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-26T18:46:41.458Z: Workers have started successfully.
    Aug 26, 2020 6:47:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-26T18:47:20.607Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 26, 2020 6:47:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-26T18:47:20.750Z: Cleaning up.
    Aug 26, 2020 6:47:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-26T18:47:20.815Z: Stopping worker pool...
    Aug 26, 2020 6:48:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-26T18:48:14.002Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 26, 2020 6:48:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-26T18:48:14.046Z: Worker pool stopped.
    Aug 26, 2020 6:48:21 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-26_11_45_36-3926239607567426902 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 6351f5e8-1cd7-4b7c-85e5-76150b6df1a8 and timestamp: 2020-08-26T18:48:22.008000000Z:
                     Metric:                    Value:
                   read_time                    18.196
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 26, 2020 6:48:22 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.032 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker,5,main]) completed. Took 3 mins 0.058 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 3s
106 actionable tasks: 73 executed, 33 from cache

Publishing build scan...
https://gradle.com/s/s3bx2spgql2j4

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #918

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/918/display/redirect?page=changes>

Changes:

[noreply] [BEAM-10675] Add Python GBK Load Tests for streaming on Dataflow 


------------------------------------------
[...truncated 293.07 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 26, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 26, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 26, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 26, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 26, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 26, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 26, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
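
The exception above also names the usual fix: a PCollection of Beam Rows needs an explicit schema before a coder can be inferred. A minimal sketch of that call, assuming a hypothetical PCollection<Row> named "rows" and a schema matching the four projected columns (none of these names are taken from the test code itself):

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    // Hypothetical schema for the projected columns in the query above.
    Schema schema =
        Schema.builder()
            .addStringField("author")
            .addStringField("type")
            .addStringField("title")
            .addInt64Field("score")
            .build();

    // "rows" stands in for the PCollection<Row> produced earlier in the pipeline.
    // Attaching the schema lets Beam infer RowCoder and avoids the
    // "Unable to return a default Coder" failure reported here.
    PCollection<Row> withSchema = rows.setRowSchema(schema);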

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 26, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 26, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 26, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 26, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 26, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 26, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 26, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 26, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
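
For the push-down test the table is read through the BigQuery Storage API (DIRECT_READ), so the planner can hand both the projected fields and the supported filter to BigQuery, as the BEAMPlan and the pushed-down filter above show. Roughly, at the BigQueryIO level, the read corresponds to the sketch below; the table reference is a placeholder and the exact wiring inside BigQueryTable is not shown here:

    import java.util.Arrays;
    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;

    // Storage API read with column projection and a row restriction,
    // mirroring the usedFields and filter reported by the planner above.
    TypedRead<TableRow> read =
        BigQueryIO.readTableRows()
            .from("bigquery-public-data:hacker_news.full")  // placeholder table reference
            .withMethod(TypedRead.Method.DIRECT_READ)
            .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
            .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2");
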
    Aug 26, 2020 12:45:14 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 26, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 26, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-ubqw-asJJdVq3IXA8IojEmbLuO-pQr76ttVbVjp8Vco.jar
    Aug 26, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-WcIll2HJJVEOSseVaenptY0loC5OEpJ0jgrjjClgsRk.jar
    Aug 26, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-5I8hwG4pmJaBN6D_44OVFtx8GjXGV3E2huCst7eYxpQ.jar
    Aug 26, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-q7dDlWzImc5ng7i0xFZZ9kc_n3NqTrDW20qlk3-hyLs.jar
    Aug 26, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-TA5qqgx-aoM7oHr8qe4vpOZFwdkUPDw-ar56cvkYPLY.jar
    Aug 26, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-Mt80TloAb6L3Pxk20c5fkzmE-qKlS5JT0BaLEFWRBGc.jar
    Aug 26, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-m0-12Vs2MkWCy-XPv7gl_sxXXYXREsWyCTA9GfgBgmY.jar
    Aug 26, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-cUuhhIrNPetT8m6KiuAvqIMyE7Hl6DBfECxPwd84ZWM.jar
    Aug 26, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-6RwlKogATYR22w0yg7XAR8TU9pXRuOLevIXQD5C9N7A.jar
    Aug 26, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-OC6sC0Y8uZfbvLzhcD81Id7fFl_dRcDPe8O6AQ9zkqo.jar
    Aug 26, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-T8v_inI41K49JNdVOoAkd8WL9vhHRNqwiipLc-OWbCU.jar
    Aug 26, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-qhP46W8AsR2zC7IQDtfWLdjhyZOrcNGSyA-TjpLq_38.jar
    Aug 26, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-55X0hBeOW_CddooAv3nN7xyuOK9dd9Xmke8_ir2gjdE.jar
    Aug 26, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-yNHH9lPjQ9ZD82XoViUL3dY5oL1DuvNNlM1fGccr2dI.jar
    Aug 26, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-vaT7z35IGCV0T_o-B2ZczB4WCbOuzgc5WQqSWBQ6aDI.jar
    Aug 26, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-lTuYYY6IRWolvSZeRywdrj-DAoUwdLYMuiM4TMhoroQ.jar
    Aug 26, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-eCFDzj_qmnWdaXWolVnKv2HMgUKYRBow_In9uCKqEQI.jar
    Aug 26, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test141836003757153621.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-JpIYbafkx9PrQIM4YtAy9HyEAkeKxLiEY4-aqEzW9Z0.jar
    Aug 26, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-Dy5iJdMJH10gnohDfgnnrZxuu6mU7lpfLY_huX11_tw.jar
    Aug 26, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-CY00uiBAMGF1_X-Nlg7Ak1W895lSj9WpdgWYaNz_02U.jar
    Aug 26, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-eQ3xSJgmNV_sylx326HLAm3ryqRs1ye7TkItd2NyfZE.jar
    Aug 26, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-NE9q39NRX_xHIxBYR35AvCOPXpTPj9nP6Hy01Ik7MjU.jar
    Aug 26, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-3Onb7KR2w_698mVuYQBG3QeQ9U0AXOFkVBHI4DwOHgw.jar
    Aug 26, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-bzS67XqcvS9rRCAijNdd5_mYvtUtiny44FAS9vrF0Z4.jar
    Aug 26, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-uwcvO-jTtxW8AV4pk4sprw1o9OWxaYJSYHS5vwnkAaU.jar
    Aug 26, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-ubqw-asJJdVq3IXA8IojEmbLuO-pQr76ttVbVjp8Vco.jar
    Aug 26, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-VIkyI3QNKwDJK0algD4Ydm00Qm52bIqZF2BpnQhaosw.jar
    Aug 26, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-Qq5N5i6AnxLbXLjtvsMAk-ooTj0lOqbsG05qWBJqPLo.jar
    Aug 26, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-70n054f0G0eWyX-4Lz17ILQeOk8pEFMfFPATU0H43K4.jar
    Aug 26, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-Y6U9KrryClfO3DTcpY8Hymj_HNIEqr3pboAGUtC2jQ8.jar
    Aug 26, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-hxljJLhX9GKITW_xlIDlfrPbUYTkx0Eu4wkxBf2mzco.jar
    Aug 26, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 1 seconds
    Aug 26, 2020 12:45:17 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 26, 2020 12:45:17 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 26, 2020 12:45:17 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 26, 2020 12:45:17 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 26, 2020 12:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 26, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92059 bytes, hash 41ef33b5fc26309e53853332d2a5761b4dcf7dbb73baac6c52a3d236c98eca1d> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Qe8ztfwmMJ5ThTMy0qV2G03PfbtzuqxsUqPSNsmOyh0.pb
    Aug 26, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Aug 26, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-26_05_45_18-17619971765402214936?project=apache-beam-testing
    Aug 26, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-26_05_45_18-17619971765402214936
    Aug 26, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-26_05_45_18-17619971765402214936
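
The terminal-state logging further down comes from blocking on the submitted job. A generic sketch of that pattern in the Java SDK (variable names are not from the test):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.PipelineResult;

    // Submitting the job returns a PipelineResult (a DataflowPipelineJob on Dataflow);
    // waitUntilFinish() blocks until a terminal state such as DONE is reached,
    // which is what produces the "finished with status DONE" line below.
    PipelineResult result = pipeline.run();
    PipelineResult.State finalState = result.waitUntilFinish();
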
    Aug 26, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-26T12:45:18.245Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 26, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-26T12:45:26.521Z: Worker configuration: n1-standard-1 in us-central1-f.
    Aug 26, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-26T12:45:27.206Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 26, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-26T12:45:27.248Z: Expanding GroupByKey operations into optimizable parts.
    Aug 26, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-26T12:45:27.274Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 26, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-26T12:45:27.356Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 26, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-26T12:45:27.396Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 26, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-26T12:45:27.426Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 26, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-26T12:45:27.484Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 26, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-26T12:45:27.957Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 26, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-26T12:45:28.046Z: Starting 5 workers in us-central1-f...
    Aug 26, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-26T12:45:35.493Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 26, 2020 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-26T12:45:54.541Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 26, 2020 12:46:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-26T12:46:17.689Z: Workers have started successfully.
    Aug 26, 2020 12:46:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-26T12:46:17.728Z: Workers have started successfully.
    Aug 26, 2020 12:46:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-26T12:46:52.579Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 26, 2020 12:46:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-26T12:46:52.730Z: Cleaning up.
    Aug 26, 2020 12:46:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-26T12:46:52.835Z: Stopping worker pool...
    Aug 26, 2020 12:47:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-26T12:47:38.001Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 26, 2020 12:47:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-26T12:47:38.043Z: Worker pool stopped.
    Aug 26, 2020 12:47:45 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-26_05_45_18-17619971765402214936 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 38163d1c-ec7e-4696-b4bc-ef95cacfb930 and timestamp: 2020-08-26T12:47:45.704000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    13.051

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 26, 2020 12:47:46 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.022 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 2 mins 41.534 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 29s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/rj5tojdubufwu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #917

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/917/display/redirect?page=changes>

Changes:

[noreply] [BEAM-9680] Add Aggregation Count lesson to Go SDK katas (#12646)


------------------------------------------
[...truncated 295.47 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 26, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 26, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 26, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 26, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 26, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 26, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 26, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 26, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 26, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 26, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 26, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 26, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 26, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 26, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 26, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 26, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 26, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 26, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-WHW944xAtSsEfaEQsYbq9Uj5oYzCJRwhwS9JrOxNSoY.jar
    Aug 26, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-RiZ0E2GEDKdFtmPGItRJW9-S2nXrCdLgkHmCO75A6uA.jar
    Aug 26, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-AwMDOPvTN1QwWyynSQDaJ82vsx_A0HU487GIBuycFV0.jar
    Aug 26, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-y18dUqFiMYX81w9JxFVNUroYpSmPR61uv651r5K5Dt8.jar
    Aug 26, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-bk09UM7S0465OLCx00w8yKaUWFYZEMwZGL6lBq_TxSs.jar
    Aug 26, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-eYdrP4sr8NLGpAHlFu7VuLmfLOi7wAeQQoq4BPXsVvA.jar
    Aug 26, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-oMxCWs8JNyoacn4U0Zk0O1CpHBBH-d5udAWRumIpATk.jar
    Aug 26, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-q7F03b4eL5sF3xaT1IThhD_6IBU5cGI6hsW9JEN0fxU.jar
    Aug 26, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-6d1N6nZNOVsoVEQzVsFhZbbGT2DsPoXWejA9ZAXh4DM.jar
    Aug 26, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-ukyz9i49MMH9vS3oesSc_vIWxmMmoeLxB70LAwaHLC8.jar
    Aug 26, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-b2XyJiDW-T0oJMgbMX9GkdcKt3g7gAIaRguf07I_eoE.jar
    Aug 26, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-5_s63H9YC8lnwfb48AhPvq4nBzXXbDId9ZLMXdGBK0A.jar
    Aug 26, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-r0tT8GI8bB9CbcKSHXKvmdHLEhxMhM99QsDqrJfjWC0.jar
    Aug 26, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-fL_WkqBcqDGUZixmTD8ceTq2CZqlD2U8r5DeWbDe6pw.jar
    Aug 26, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-teafaw0-KKUIWG1CDxXvIOliTUZlWCj7DwDHtSlTBMM.jar
    Aug 26, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-N7ScDl3fhMtXeTFkJjqyzBymjy35Yupih-E7KihYEnI.jar
    Aug 26, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-WHW944xAtSsEfaEQsYbq9Uj5oYzCJRwhwS9JrOxNSoY.jar
    Aug 26, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-9ndEiarVqIp67g34Qlvf-U65ULsf4HhRKdkomv0wZEo.jar
    Aug 26, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test270511413285148974.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-WbUqVM-f1M1ay0TcqS5NILOSzHkrAs_oFwsZRkhbrKs.jar
    Aug 26, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-hrnq9aKuoPGFYPE1zbhHSraeF2nh8L7WFaeXoepv4_8.jar
    Aug 26, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-kHBRuJdpBUflkZW9hAX1OXkjFqFy-XcYDOVRJiA7y-4.jar
    Aug 26, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-JIbIQQ3eHObtMukbnuH08z7Kds2ABBNuiL6DMaCCJuE.jar
    Aug 26, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-nFKjYXw1ieJhOj1zs9lfNqIHYoGPcdXT4DPfaPHi3s8.jar
    Aug 26, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-3w02st9opYKO2CpPNbJBvZw1H6ifMHSq00DNFXbPO30.jar
    Aug 26, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-wozB-o5FNo3CJoFZU31JrM5PDfPpTHC2oZQZ5JoRENE.jar
    Aug 26, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-rkgk04S_hJ0JVR0bEtbIYQFfbjgIBlJJc508aXJ8P6g.jar
    Aug 26, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-WuxutIpstndrta6IP7MmKoLZvZrDW3P9f9AUnw1K-o0.jar
    Aug 26, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-pwFZqn9a5r6SWXUTLKpQvaCqGKAZgz74Smy_bZHFHzM.jar
    Aug 26, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT--P4hhSPD7aXJ4C_XBRuR1Ivubo9Cn55JOoBLI8a9qkE.jar
    Aug 26, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-l1y77kr2qG44YWGBwjTLu5fHmRjvC_kob3uCUFMbiVU.jar
    Aug 26, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-zCPunEXOvRBjX3sFQQaeWgWqqca68yru4FzQBqQh2ak.jar
    Aug 26, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 1 seconds
    Aug 26, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 26, 2020 6:45:21 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 26, 2020 6:45:21 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 26, 2020 6:45:21 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 26, 2020 6:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 26, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92060 bytes, hash 42fb153953645719c9e02d4e695d7b49d42811564d78f90858089f78f5f1679b> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-QvsVOVNkVxnJ4C1OaV17SdQoEVZNePkIWAifePXxZ5s.pb
    Aug 26, 2020 6:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Aug 26, 2020 6:45:22 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-25_23_45_21-1781052873804814194?project=apache-beam-testing
    Aug 26, 2020 6:45:22 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-25_23_45_21-1781052873804814194
    Aug 26, 2020 6:45:22 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-25_23_45_21-1781052873804814194
    Aug 26, 2020 6:45:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-26T06:45:21.535Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 26, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-26T06:45:29.978Z: Worker configuration: n1-standard-1 in us-central1-f.
    Aug 26, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-26T06:45:31.543Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 26, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-26T06:45:31.630Z: Expanding GroupByKey operations into optimizable parts.
    Aug 26, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-26T06:45:31.659Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 26, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-26T06:45:31.741Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 26, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-26T06:45:31.772Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 26, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-26T06:45:31.822Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 26, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-26T06:45:31.855Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 26, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-26T06:45:32.432Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 26, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-26T06:45:32.518Z: Starting 5 workers in us-central1-f...
    Aug 26, 2020 6:45:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-26T06:45:40.509Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 26, 2020 6:46:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-26T06:46:04.422Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 26, 2020 6:46:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-26T06:46:32.484Z: Workers have started successfully.
    Aug 26, 2020 6:46:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-26T06:46:32.529Z: Workers have started successfully.
    Aug 26, 2020 6:47:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-26T06:47:07.494Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 26, 2020 6:47:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-26T06:47:07.637Z: Cleaning up.
    Aug 26, 2020 6:47:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-26T06:47:07.715Z: Stopping worker pool...
    Aug 26, 2020 6:48:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-26T06:47:59.896Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 26, 2020 6:48:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-26T06:47:59.958Z: Worker pool stopped.
    Aug 26, 2020 6:48:09 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-25_23_45_21-1781052873804814194 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 38d26d39-662d-4b97-a584-c469119800e4 and timestamp: 2020-08-26T06:48:09.199000000Z:
                     Metric:                    Value:
                   read_time                    13.428
                 fields_read                 4375276.0
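
    The two figures above are Beam metrics, presumably populated by the monitor ParDos added as steps s2 and s4 above and read back from the finished PipelineResult. Below is a minimal sketch of declaring a counter in a DoFn and querying it afterwards through the metrics API; the "sketch" namespace, the reuse of the name fields_read and the toy pipeline are illustrative only, not the IT's own wiring.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.PipelineResult;
    import org.apache.beam.sdk.metrics.Counter;
    import org.apache.beam.sdk.metrics.MetricNameFilter;
    import org.apache.beam.sdk.metrics.MetricQueryResults;
    import org.apache.beam.sdk.metrics.MetricResult;
    import org.apache.beam.sdk.metrics.Metrics;
    import org.apache.beam.sdk.metrics.MetricsFilter;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;

    public class FieldsReadMetricSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        p.apply(Create.of("a", "b", "c"))
            .apply(
                ParDo.of(
                    new DoFn<String, String>() {
                      // Illustrative counter; the IT's monitors use their own namespace/name.
                      private final Counter fieldsRead = Metrics.counter("sketch", "fields_read");

                      @ProcessElement
                      public void process(@Element String element, OutputReceiver<String> out) {
                        fieldsRead.inc();
                        out.output(element);
                      }
                    }));

        PipelineResult result = p.run();
        result.waitUntilFinish();

        // Query the counter back from the finished job.
        MetricQueryResults metrics =
            result
                .metrics()
                .queryMetrics(
                    MetricsFilter.builder()
                        .addNameFilter(MetricNameFilter.named("sketch", "fields_read"))
                        .build());
        for (MetricResult<Long> counter : metrics.getCounters()) {
          System.out.println(counter.getName() + ": " + counter.getAttempted());
        }
      }
    }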

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 26, 2020 6:48:09 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 8,5,main]) completed. Took 3 mins 2.888 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 52s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/fsuzrpx7sir7q

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #916

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/916/display/redirect?page=changes>

Changes:

[rehmanmuradali0] Adding key in Spark

[kmjung] Update BigQuery storage API source to v1

[kmjung] Make stream progress reporting work with v1 API

[kmjung] Clean up some leftovers

[kmjung] Address code review feedback

[noreply] Dataframe set_index is not index preserving (#12676)


------------------------------------------
[...truncated 295.12 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 26, 2020 12:45:36 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 26, 2020 12:45:36 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 26, 2020 12:45:36 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 26, 2020 12:45:36 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 26, 2020 12:45:36 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 26, 2020 12:45:36 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 26, 2020 12:45:36 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
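
    The default-method failure above is Beam's standard construction-time error for a ParDo that emits Row elements without an attached schema. Below is a minimal, self-contained sketch of the remedy the message itself suggests (PCollection.setRowSchema); the two-column schema, the parsing DoFn and the sample elements are hypothetical and only illustrate the shape of the fix, not the IT's actual code.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaFixSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        // Hypothetical two-column schema standing in for the real table layout.
        Schema schema =
            Schema.builder().addStringField("type").addInt64Field("score").build();

        PCollection<Row> rows =
            p.apply(Create.of("story,5", "comment,1"))
                .apply(
                    ParDo.of(
                        new DoFn<String, Row>() {
                          @ProcessElement
                          public void process(@Element String line, OutputReceiver<Row> out) {
                            String[] parts = line.split(",");
                            out.output(
                                Row.withSchema(schema)
                                    .addValues(parts[0], Long.parseLong(parts[1]))
                                    .build());
                          }
                        }))
                // Without this call the Row output has no coder and the pipeline fails
                // with the IllegalStateException shown above.
                .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }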

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 26, 2020 12:45:36 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 26, 2020 12:45:36 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 26, 2020 12:45:36 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 26, 2020 12:45:36 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 26, 2020 12:45:36 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 26, 2020 12:45:36 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 26, 2020 12:45:37 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 26, 2020 12:45:37 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
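
    The plan above is where the push-down pays off: the projection (usedFields) and the supported part of the filter are handed to the BigQuery Storage read itself, leaving only a pass-through BeamCalcRel. The same query shape can be tried locally with SqlTransform over an in-memory PCOLLECTION, in which case the whole WHERE clause runs in the Calc stage instead; the reduced schema and sample rows below are hypothetical, not the HACKER_NEWS table.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class HackerNewsFilterSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        // Reduced, hypothetical schema; the real table has many more columns.
        Schema schema =
            Schema.builder()
                .addStringField("by")
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();

        PCollection<Row> filtered =
            p.apply(
                    Create.of(
                            Row.withSchema(schema).addValues("alice", "story", "A story", 5L).build(),
                            Row.withSchema(schema).addValues("bob", "comment", "A reply", 9L).build())
                        .withRowSchema(schema))
                // A single input PCollection is visible to Beam SQL as PCOLLECTION.
                .apply(
                    SqlTransform.query(
                        "SELECT `by` AS author, type, title, score "
                            + "FROM PCOLLECTION "
                            + "WHERE (type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }

    When the same statement targets the BigQuery table provider with method DIRECT_READ, the planner instead produces the BeamPushDownIOSourceRel shown above rather than a plain BeamIOSourceRel.
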
    Aug 26, 2020 12:45:37 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 26, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 26, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-MR6WjQwJ_7_uUCvGBbUhGl2KzrEU_bveTCOO2jGXEFk.jar
    Aug 26, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3911476810051603262.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-aK02AnRg2fziNJo9j9Ugj5Zqhbyu_17IqVs_49zzMMc.jar
    Aug 26, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-4ke53wsYurohuYcipg4kSi6YKoL0czrseFwunOvlY8g.jar
    Aug 26, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-9rzRFTaJQZHgAjecoDe0grEk8P3r8G0IszEK8EZd3ro.jar
    Aug 26, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-02sEIZZUJuNfbl1H0h_Y-jyI7Q4FB8mkqWSU48P3wig.jar
    Aug 26, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-MR6WjQwJ_7_uUCvGBbUhGl2KzrEU_bveTCOO2jGXEFk.jar
    Aug 26, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-ypQV7vu6wDoib-jCEYfPbtX6GwmNjEVJS-spS5eVdzo.jar
    Aug 26, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-aJ_vE5whbjt2I2TXyY0oH-moNTwm8vVxAMs1GELlh68.jar
    Aug 26, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-XOOV2F2gmMr30iFd-xjouJthiO3SUB4FVoCdYq-Fcgw.jar
    Aug 26, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-pesw4b5Y6G1-tS6dtjk0Vm3Daiz7K4w_XIoHP5phDKY.jar
    Aug 26, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-gzKLNKes82x0B_guqXJBb3LGIFb7yJFJJpEXUHnJUXU.jar
    Aug 26, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-lbEe3BOpIFA9I1VtKBzhnKObbh9KzFdCCL6wux6s6V0.jar
    Aug 26, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-a3RALRAljaackvBaMTs4dETW1pgIgekAp5vMxSc1D70.jar
    Aug 26, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-Pdtx5dSrGXTXi4KL-knHCMhFMXAvzyfCbZGeDTXU9Lk.jar
    Aug 26, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-flRBW0KPgEzk8C0VsCEd8tMyEhtUF-LV0sPqe4NWxfs.jar
    Aug 26, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-yVo4ubXeQb5kKCSXOKUI04JLbWGEurTWjwWv6oZczVQ.jar
    Aug 26, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-9MxybtM896bt3BqIBx8grPbv9lCyTcvbXVPtJOUcEZQ.jar
    Aug 26, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-UIMN6Sf-dIA6zXD5cJcDd-ImdV45VaU0bH8xIs1hL_o.jar
    Aug 26, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-m7luk0YpVg2X0cEcO08MAebM9LL5UYZ55LXMr1sSwrk.jar
    Aug 26, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-4zWo05mECtKON4PVmCLagRQy0qgUe0Qs5kVIGaRYkXM.jar
    Aug 26, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-hRq4o7p_W-jTcxeAhjbpet9uILl53aUsgWqGzw8iV7Y.jar
    Aug 26, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-mjJl_6Rc6iYZs5yVY6jB9hqsaL1ZlmkRDrNKISXQfXk.jar
    Aug 26, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-BhhAeEa2BDiZZsDEKhfj-IFbKuiDHdGxDEeBl1StqIM.jar
    Aug 26, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-mcLvQdbAf0Q2PMHvxEPTmV94Qbp5l7_5e_kuIxOPc4k.jar
    Aug 26, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-sBSZriKfTF9qJ5jcyX6Y1EutSFU2JS0eK-J9RkIOjXY.jar
    Aug 26, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-y8NCZVAyUio49WOshCej7z42TYqgA3H9NfrmTlsoOJw.jar
    Aug 26, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-m-S-n4NdtKYohNQglzOrXukFo02bj5uRR3LEWMLUMIU.jar
    Aug 26, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-WfM0DFVViX7wmG_Kz27RWYKgdp7gahDurbhIudOskUA.jar
    Aug 26, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-ADKz2wkDaJb4gh_kQy2U-R6TqdhvFzLqL_TASkAwpDE.jar
    Aug 26, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-Yta6ozGdTQyNAf6R7k-K4grlTbAmdGS8iUnnWIiPDWU.jar
    Aug 26, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-Gq6fPVa06VCZg897bsd4Ztx5ETgyqAK-BJdiKc_8jdg.jar
    Aug 26, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 1 seconds
    Aug 26, 2020 12:45:41 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 26, 2020 12:45:41 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 26, 2020 12:45:41 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 26, 2020 12:45:41 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 26, 2020 12:45:41 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 26, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92060 bytes, hash 206de20fcb81f06bd978773ad54df86ffc07854d3c20c21a5465fc5518f635ac> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-IG3iD8uB8GvZeHc61U34b_wHhU08IMIaVGX8VRj2Naw.pb
    Aug 26, 2020 12:45:41 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Aug 26, 2020 12:45:43 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-25_17_45_41-14962160500209164024?project=apache-beam-testing
    Aug 26, 2020 12:45:43 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-25_17_45_41-14962160500209164024
    Aug 26, 2020 12:45:43 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-25_17_45_41-14962160500209164024
    Aug 26, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-26T00:45:41.911Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 26, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-26T00:45:49.827Z: Worker configuration: n1-standard-1 in us-central1-a.
    Aug 26, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-26T00:45:50.862Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 26, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-26T00:45:50.906Z: Expanding GroupByKey operations into optimizable parts.
    Aug 26, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-26T00:45:50.947Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 26, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-26T00:45:51.050Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 26, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-26T00:45:51.078Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 26, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-26T00:45:51.114Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 26, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-26T00:45:51.147Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 26, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-26T00:45:51.464Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 26, 2020 12:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-26T00:45:51.548Z: Starting 5 workers in us-central1-a...
    Aug 26, 2020 12:46:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-26T00:46:13.127Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 26, 2020 12:46:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-26T00:46:23.913Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 26, 2020 12:46:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-26T00:46:46.703Z: Workers have started successfully.
    Aug 26, 2020 12:46:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-26T00:46:46.735Z: Workers have started successfully.
    Aug 26, 2020 12:47:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-26T00:47:19.392Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 26, 2020 12:47:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-26T00:47:19.546Z: Cleaning up.
    Aug 26, 2020 12:47:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-26T00:47:19.638Z: Stopping worker pool...
    Aug 26, 2020 12:48:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-26T00:48:13.242Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 26, 2020 12:48:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-26T00:48:13.285Z: Worker pool stopped.
    Aug 26, 2020 12:48:20 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-25_17_45_41-14962160500209164024 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): f41d5d28-67b8-4d82-b866-eadde007d2dc and timestamp: 2020-08-26T00:48:20.830000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    13.616

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 26, 2020 12:48:21 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.019 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 2 mins 53.0 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 4s
106 actionable tasks: 73 executed, 33 from cache

Publishing build scan...
https://gradle.com/s/u5hzmjyuy7wy2

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #915

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/915/display/redirect>

Changes:


------------------------------------------
[...truncated 295.70 KB...]
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 25, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 25, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 25, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 25, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 25, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 25, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 25, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 25, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 25, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 25, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 25, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 25, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 25, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 25, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 25, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 25, 2020 6:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-uzjb1mYTAKv97qgykPufnWWa4RklJcqwIstK_9mrhCI.jar
    Aug 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-GS2V4ZEdLY1xL7CcyoQzM09-6wRWQAcjFAWBYQ9QWEs.jar
    Aug 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-Lx9UDg5ChI7aTq5mhJBwwz9bsHNyy0Si6LCzw5mxEF8.jar
    Aug 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-NrsKtq9giq7kOx0Z_UUZNc5mf6KqNAjethUyMYGmVe0.jar
    Aug 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-1gVmb-oO0KIIgn-aWNs6T8ZMTp5RSmTLjLWQjecLUqo.jar
    Aug 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-qRB6nybO9x-Nnf1W6FxLNt-H2ejFNychxxKvyh06qqI.jar
    Aug 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-LwCQbMHrd4TyZLVEDKxszIdiodesPx04lLRT0pio8zk.jar
    Aug 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-I951Xsc3bNAj0gvBG41PmIiPfHJZMoYWdA5JRaHj-Ic.jar
    Aug 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-bE98LyNrAPWvWWhQCmF2mi1t8J_s83eBq7-cnzrG2as.jar
    Aug 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3678059470051919692.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-wmdI3cGSZahlDaxdyhN2OZgMp9Zg-PRr47kScRquVbI.jar
    Aug 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-nb2wVoz1ryfCJcdQZr67NjQX6ZO3GuSFXMS1ff_T-o0.jar
    Aug 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-9o4ThSGBtu-dkBvZbgUuh0WJYNZKC2_MVIJSsAasivU.jar
    Aug 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-hKwZRsK9IRisjfjiEW9-HJ1pixStPqH0fVnXNXL0sP0.jar
    Aug 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-TxNzN-LckUIwp70HvaCs3FX-GGjCgS8-YYMWOdofbkQ.jar
    Aug 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-1DkZGMAE4APqZsMSMcqbmYi-zmtSR24QaQZ6mgPofNQ.jar
    Aug 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT--E5kdnusZB9JXneFq4IZulz7WwHaYokdEGlus3X0caU.jar
    Aug 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-g8MN2hTlXdJZf2xl-IN_qTl18GgXfsiUSUdFDjMsuB0.jar
    Aug 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-dshrrnMIuahDTHxJvdZFQX5jzD3r_3gz4_xAhMpLbTU.jar
    Aug 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-95Rbiq_uFZ_O4k8K56H8yvOrawB0tOh-s5qHjm8vKxw.jar
    Aug 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-ROafodl0xxVRWj58oPwlgQofR_F-Dhhcx7cEQoy4xU0.jar
    Aug 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-9iMOdYbUd6mJlKisArbDzH31WNa_Bz6uFxvKnR2RiN0.jar
    Aug 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-cPu6s_ejwsWslwEtd_xiH1cgaH1wkh9A_EC4nJBKQvQ.jar
    Aug 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-NrsKtq9giq7kOx0Z_UUZNc5mf6KqNAjethUyMYGmVe0.jar
    Aug 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-QrQAmRLJcfhooetD_jCXzXmvkmHab7xsRp-hURR1PJM.jar
    Aug 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-tA01ZZZv5g-yLlc2UpqRqDTKbihTaNaE9dnh6myQ5Uo.jar
    Aug 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-dhzaC987SQ4ijIdQSYhvM5WMuzGwNP-iaw_8QOfB7iM.jar
    Aug 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-Si567YFkjPHlb69zjGAQJDe9zYM6AoChw6G9JscJiNQ.jar
    Aug 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-AkdL9JNDaaz0UgxSCK8L9-tK1EJ5Di0gioVtOvt_UbA.jar
    Aug 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-FVxdqim9me5yRJSWJND-_EzDdfk-VAppPHYf09TIrbo.jar
    Aug 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-gr4yqV_yXZ1Y1q1SQxCRPreSagmpZvSfApVoJnVoCgE.jar
    Aug 25, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-JilifdO7zqjyZhlOs_MlWo5dXXLLYv8q6YzMQNtVrJw.jar
    Aug 25, 2020 6:45:24 PM org.apache.beam.sdk.metrics.MetricsEnvironment getCurrentContainer
    WARNING: Reporting metrics are not supported in the current execution environment.
    Aug 25, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 6 seconds
    Aug 25, 2020 6:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 25, 2020 6:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 25, 2020 6:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 25, 2020 6:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 25, 2020 6:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 25, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92128 bytes, hash 95beecf4f4d63d0451d946ea0b6805480f5ac1577d47aa56e89cbd7f295ffa3f> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-lb7s9PTWPQRR2UbqC2gFSA9awVd9R6pW6Jy9fylf-j8.pb
    Aug 25, 2020 6:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Aug 25, 2020 6:45:27 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-25_11_45_25-4954586048718118692?project=apache-beam-testing
    Aug 25, 2020 6:45:27 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-25_11_45_25-4954586048718118692
    Aug 25, 2020 6:45:27 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-25_11_45_25-4954586048718118692
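
(For reference, a job submitted from Java code can also be cancelled through the returned PipelineResult handle instead of the gcloud command above. A minimal sketch, assuming the Pipeline object is available as "pipeline" in the driver code; this is not the test's own code:)

    import java.io.IOException;
    import org.apache.beam.sdk.PipelineResult;

    PipelineResult result = pipeline.run();
    try {
      // Asks the runner (here Dataflow) to cancel the running job.
      result.cancel();
    } catch (IOException e) {
      // Cancellation request failed; the gcloud command above still works.
    }
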
    Aug 25, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-25T18:45:26.023Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 25, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-25T18:45:33.693Z: Worker configuration: n1-standard-1 in us-central1-a.
    Aug 25, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-25T18:45:34.382Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 25, 2020 6:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-25T18:45:34.415Z: Expanding GroupByKey operations into optimizable parts.
    Aug 25, 2020 6:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-25T18:45:34.447Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 25, 2020 6:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-25T18:45:34.515Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 25, 2020 6:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-25T18:45:34.549Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 25, 2020 6:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-25T18:45:34.584Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 25, 2020 6:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-25T18:45:34.618Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 25, 2020 6:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-25T18:45:34.979Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 25, 2020 6:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-25T18:45:35.046Z: Starting 5 workers in us-central1-a...
    Aug 25, 2020 6:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-25T18:45:56.125Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 25, 2020 6:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-25T18:46:04.146Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 25, 2020 6:46:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-25T18:46:28.069Z: Workers have started successfully.
    Aug 25, 2020 6:46:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-25T18:46:28.098Z: Workers have started successfully.
    Aug 25, 2020 6:46:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-25T18:46:57.449Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 25, 2020 6:46:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-25T18:46:57.595Z: Cleaning up.
    Aug 25, 2020 6:46:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-25T18:46:57.684Z: Stopping worker pool...
    Aug 25, 2020 6:47:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-25T18:47:51.843Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 25, 2020 6:47:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-25T18:47:51.885Z: Worker pool stopped.
    Aug 25, 2020 6:48:01 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-25_11_45_25-4954586048718118692 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 3105842c-7ed0-4290-a56c-b571f99b8655 and timestamp: 2020-08-25T18:48:01.370000000Z:
                     Metric:                    Value:
                   read_time                     11.58
                 fields_read                 4375276.0
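
(The read_time and fields_read values above are Beam metrics reported by the test's monitoring ParDos, such as ParDo(RowMonitor) and ParDo(TimeMonitor). A minimal sketch of how such a counter is typically incremented inside a DoFn; the class name and the exact counting logic are illustrative assumptions, not the actual RowMonitor implementation:)

    import org.apache.beam.sdk.metrics.Counter;
    import org.apache.beam.sdk.metrics.Metrics;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.values.Row;

    class CountFieldsFn extends DoFn<Row, Row> {
      private final Counter fieldsRead =
          Metrics.counter(CountFieldsFn.class, "fields_read");

      @ProcessElement
      public void processElement(@Element Row row, OutputReceiver<Row> out) {
        // One increment per field of every row that passes through.
        fieldsRead.inc(row.getFieldCount());
        out.output(row);
      }
    }
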

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 25, 2020 6:48:01 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.019 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 2,5,main]) completed. Took 2 mins 54.455 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 44s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/cvzmdcc6um6yy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #914

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/914/display/redirect>

Changes:


------------------------------------------
[...truncated 293.72 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 25, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 25, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 25, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
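
(The same query can also be expressed directly against a registered PCollection with SqlTransform. A minimal sketch, assuming an input PCollection<Row> named "rows" whose schema matches the HACKER_NEWS table; this is not the mechanism the integration test itself uses:)

    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.PCollectionTuple;
    import org.apache.beam.sdk.values.Row;
    import org.apache.beam.sdk.values.TupleTag;

    PCollection<Row> filtered =
        PCollectionTuple.of(new TupleTag<Row>("HACKER_NEWS"), rows)
            .apply(SqlTransform.query(
                "SELECT `by` AS author, type, title, score "
                    + "FROM HACKER_NEWS "
                    + "WHERE (type = 'story' OR type = 'job') AND score > 2"));
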
    Aug 25, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 25, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 25, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 25, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
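
(The exception above also names the fix it expects: attach a schema so a RowCoder can be inferred for the PCollection of Rows. A minimal sketch of what that looks like in user code; the variable name, field names, and types are assumptions for illustration, not the actual HACKER_NEWS schema:)

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    Schema schema = Schema.builder()
        .addStringField("author")
        .addStringField("type")
        .addStringField("title")
        .addInt64Field("score")
        .build();

    // 'rows' is a PCollection<Row> produced upstream (hypothetical variable name).
    // Equivalent to rows.setCoder(RowCoder.of(schema)), as the message suggests.
    rows.setRowSchema(schema);
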

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 25, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 25, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 25, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 25, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 25, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 25, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 25, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 25, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
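
(The projected fields and the filter shown above are what the SQL layer hands to the BigQuery storage read. A hand-written BigQueryIO read with the same projection and row restriction would look roughly like the sketch below; the table reference is a placeholder, not necessarily the table this test reads:)

    import com.google.api.services.bigquery.model.TableRow;
    import java.util.Arrays;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.values.PCollection;

    PCollection<TableRow> rows =
        pipeline.apply(
            BigQueryIO.readTableRows()
                .from("my-project:my_dataset.HACKER_NEWS")  // placeholder table
                .withMethod(BigQueryIO.TypedRead.Method.DIRECT_READ)
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                .withRowRestriction(
                    "(type = 'story' OR type = 'job') AND score > 2"));
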
    Aug 25, 2020 12:45:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 25, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 25, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-jlbWFlC0drwkMzqcVu3OsDFvv__Kf6FOfbkzxC-uZw4.jar
    Aug 25, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-o0973benxsh88PD1oqaoxFPSLKPncRCFGhfAp8OSQs4.jar
    Aug 25, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-QYYV3Z3zsKNiHF1bA-wujWOYDZiV2xe2pV9Ve8e3FZk.jar
    Aug 25, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-c5aXAWspoW-ZAZzzaLV1x1b7mGywBZ4A60X2FLp12e8.jar
    Aug 25, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-t1C0PIMNKtyHHz7nvKyo0GlgdH0ayo0yRwaNtexY5_E.jar
    Aug 25, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-YG1a18mBdDxCEeOkU8w2-CossSIg1h4fBzL5UNSJBDw.jar
    Aug 25, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-xcmP3e6_vyzhIDYtmhUOb6jURRZnN4FfepxZczOhJKc.jar
    Aug 25, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-C8BuysPVVrxJGues5I0mV3yPwIahiD2UOtLyvZFHkaQ.jar
    Aug 25, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-eZJD9C5pcMggWQoX1orKpjOfuWfozQyt9aCl9nNmPOI.jar
    Aug 25, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-TJAeXEtjVNoDb-72FtL1jHA-j8dvFCCl9uAXq7YEEEg.jar
    Aug 25, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-IupFSkDbhoKADNRpPbtywM0KBOLyHcBVtewtlQoerHk.jar
    Aug 25, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-Rti2HIxqzusYY6_xgqBpGH-dGlyo9KiuV7roNN37d90.jar
    Aug 25, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-jlbWFlC0drwkMzqcVu3OsDFvv__Kf6FOfbkzxC-uZw4.jar
    Aug 25, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-53cuDRg5SbbMoOGwE6MxD5fmEFGIPiMdTIZWtGIF6Co.jar
    Aug 25, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT--Z3MECX0Typ0GxOTpZloeVr_V978fc3lHmsCCm-MyjU.jar
    Aug 25, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-dSl1EOGov8HmcCh-cClCbiROyFx3W3cyYwshIHJN5hY.jar
    Aug 25, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-JlYqE2M1LiAMBYEQzSf-i3F7WkBl3--vj8ujf5TXM0s.jar
    Aug 25, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-LupDCEratGBOkLOsKjRfu9Z8_4c-95qec-0XRhYVSYI.jar
    Aug 25, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-VWgZnEBkqo47aLQZrIDnYZvYoA6YtU6CZDJe_4ZA8tE.jar
    Aug 25, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-AXOXwoIi4GYrAG0hp7MgtKaRNDZytTSJpOEdSl9ZTQE.jar
    Aug 25, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2000386423626972872.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-i6Lwmj9pGmaOk0bH5Dt38CbZjZ1grcMlCsl1l2zBNnQ.jar
    Aug 25, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-0zNfVMVlnwl6Idfp_uGleEup8SN6Ej4hCLDYKXFvvDQ.jar
    Aug 25, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-0jm8YaMH_NrAVRIFWg7F8ekIFpYphm6yh1EebW9n2so.jar
    Aug 25, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-IMhZHQZxFEsREN8qjcwsR8xwmZrt9lXxaSekQF_PT-Y.jar
    Aug 25, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-aJNar3gPXFhjaNOqGFka2nbtV9p95rnoxz3g7FqGtfI.jar
    Aug 25, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-AyMftE6fEdUdCQy8nJ_1PSt0TIWtPGjCEN7ONydeS2E.jar
    Aug 25, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-4eXIgLB5QRG0Zm4QK4U78SeuhgAvRg3DSxtVPlcQVig.jar
    Aug 25, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-P6Qw23iJnIBAXlTbvCqx6RMjx1h6ZkuicUzr65IhZRE.jar
    Aug 25, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-dVzpeRffOYGanFWaE2exmJm7lUqKFfHvE1jMy6dGRYE.jar
    Aug 25, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-WwS3ZwC6DfIzR1wbHJRdIXr-nzcjzEBadjEPuRLJmyY.jar
    Aug 25, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-pZmCDp70kGZRaAcj_FBLvGUD3ZgHufdrbHOr6sWiaQc.jar
    Aug 25, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 1 seconds
    Aug 25, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 25, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 25, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 25, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 25, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 25, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92128 bytes, hash 0d2ffcf0f26bcc850f52e9e0376f5599fbd92116d7c70c5f3dd46f44762d92a2> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-DS_88PJrzIUPUungN29VmfvZIRbXxwxfPdRvRHYtkqI.pb
    Aug 25, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Aug 25, 2020 12:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-25_05_45_19-18308979218500554842?project=apache-beam-testing
    Aug 25, 2020 12:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-25_05_45_19-18308979218500554842
    Aug 25, 2020 12:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-25_05_45_19-18308979218500554842
    Aug 25, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-25T12:45:19.436Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 25, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-25T12:45:26.114Z: Worker configuration: n1-standard-1 in us-central1-f.
    Aug 25, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-25T12:45:26.787Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 25, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-25T12:45:26.825Z: Expanding GroupByKey operations into optimizable parts.
    Aug 25, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-25T12:45:26.857Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 25, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-25T12:45:26.921Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 25, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-25T12:45:26.943Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 25, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-25T12:45:26.978Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 25, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-25T12:45:27.007Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 25, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-25T12:45:27.473Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 25, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-25T12:45:27.551Z: Starting 5 workers in us-central1-f...
    Aug 25, 2020 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-25T12:45:41.233Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 25, 2020 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-25T12:45:56.497Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 25, 2020 12:46:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-25T12:46:19.738Z: Workers have started successfully.
    Aug 25, 2020 12:46:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-25T12:46:19.777Z: Workers have started successfully.
    Aug 25, 2020 12:46:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-25T12:46:53.695Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 25, 2020 12:46:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-25T12:46:53.839Z: Cleaning up.
    Aug 25, 2020 12:46:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-25T12:46:53.940Z: Stopping worker pool...
    Aug 25, 2020 12:47:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-25T12:47:43.641Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 25, 2020 12:47:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-25T12:47:43.687Z: Worker pool stopped.
    Aug 25, 2020 12:47:52 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-25_05_45_19-18308979218500554842 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 66c7b79c-431c-4c6e-bd32-d39354ad7893 and timestamp: 2020-08-25T12:47:52.198000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    12.722

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 25, 2020 12:47:52 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.018 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 2,5,main]) completed. Took 2 mins 46.303 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 35s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/l2lsvobus6ju4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #913

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/913/display/redirect?page=changes>

Changes:

[noreply] [BEAM-10777] Add two blog posts detailing changes to the type hints


------------------------------------------
[...truncated 296.17 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 25, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 25, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 25, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 25, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 25, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 25, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 25, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 25, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 25, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 25, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 25, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 25, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 25, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 25, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 25, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 25, 2020 6:45:15 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 25, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 25, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-eB9ICLov9Nvj6qKlO2fc8FZjGtFKra5GhzZwKdEpeMM.jar
    Aug 25, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-sMr2wDvmpRKt6rlctcxOOLhwB3PXCEr4pPp29DNOjdc.jar
    Aug 25, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-Qc_KHo_FwJf07-mMid5brwDj2TxzOOylXQ4tMrFX978.jar
    Aug 25, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-mOtCOUiTAAmMjq4hXcxsOgIOU-txyt5WZEVdye_oTj0.jar
    Aug 25, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-7cuuPH1VAlGpTBj-Llfz5jg-mpVO4wxwlbHHcW03F8o.jar
    Aug 25, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-dK-HJ_WQQEFiO15JJCYKr_lj_Z3Iqxj_8uYQD2dbwfk.jar
    Aug 25, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-bhg8YHetGLEquE9vlzzeP1pOAfJxzkjGSrPqm_1rNxM.jar
    Aug 25, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-QxZfNo_yl9w1lEwjcjllyTcSHiwRrWgk4i6bS3vU3fU.jar
    Aug 25, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-kCmSDzdvTA7PvqqJbS4YF09l41Eml0XGQjfuGanINrg.jar
    Aug 25, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-AKJSicvEU2gnh6cQ3YRudezXCekTF1d6pOPQsxdFngE.jar
    Aug 25, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-lUrkZMJQyr8mJ9re9cCohI1Tu-_aPyWqu-ZxEGYE_Kc.jar
    Aug 25, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-SMZWdmL6FmEKgcyWB5LyLfDvK81WBqXKnOt263eO2mY.jar
    Aug 25, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-lUrkZMJQyr8mJ9re9cCohI1Tu-_aPyWqu-ZxEGYE_Kc.jar
    Aug 25, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-2RR4E6l9FOK3MMtPn9TbneCCTSEVJnP9dZa2AYsdoXQ.jar
    Aug 25, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tqEcNvc4B0bYkpfTpyBPM923sp3oQp1wvt7mW1QJ79M.jar
    Aug 25, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-mLuseo9_1hCig4pKh97PyO9eZlITuF4ifaJKUwkAkVY.jar
    Aug 25, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-31rNoEVObrqGMyikzSEekhR_99gr9kR8CFaBrcH3cU4.jar
    Aug 25, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-gQ85eYzblTQWJTUdAJ3ul9Pmrt8Z04aPUk4eq3OkWEA.jar
    Aug 25, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-960xVPiyL7Zx-z7yruy9uGRNfNZyFx82aqHET7WDupA.jar
    Aug 25, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-QH-vUnCoouf7emtG__iQp9xN0MQo4Bdsvjb1-xc-JSE.jar
    Aug 25, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5339072517683929576.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-QMeIRxBEwzG7vNGaifg7voCPi7PTuPPjJqHSaktC6iE.jar
    Aug 25, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-0JWpV1rf0DUuP71GqUI10awB-5D2dm6byH-aKcC_vGI.jar
    Aug 25, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-suTUGi5FTubQ0IsFiWD9HXtDH8DxCyDfBg4vtd9Ptus.jar
    Aug 25, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-0b5OUTwDOAodC4XgYCrkdQFBuMF732UqLvPSChCutyo.jar
    Aug 25, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-eiWcm13hpDW4xkSwg2Kp4xgnvjxc4CaPtD84XQkeavM.jar
    Aug 25, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-W1tOAISHClR5kGbB7UQVKu6n8N9J8Tg9diVmMKTxOZA.jar
    Aug 25, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-zq710d1foufygdLlXjzUZBxZp4KFAQCgj4rmMUEjJNM.jar
    Aug 25, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-Ky3uL8MBaea1TrK_9V4o6cqMjBY9gFaLmaLa1K5qs_A.jar
    Aug 25, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-0nYWbvfokpxug4Yu6HhAJCcnTobJjjEyLZicHWBen1M.jar
    Aug 25, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-nmLTaBP6H38YGoCTtISH9LFnY2gkPcAeJF5PwBX0RLg.jar
    Aug 25, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-Tg791lvOMRtAq0tJhTWRuFCCYH9_YDfS4No-n6rUzBw.jar
    Aug 25, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 0 seconds
    Aug 25, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 25, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 25, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 25, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 25, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 25, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92128 bytes, hash 72727d5112eabbf2cfb4fcac532c44a46d5539def5fc8e394c70410e47deb9d0> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-cnJ9URLqu_LPtPysUyxEpG1VOd71_I45THBBDkfeudA.pb
    Aug 25, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Aug 25, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-24_23_45_18-5043346545259508054?project=apache-beam-testing
    Aug 25, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-24_23_45_18-5043346545259508054
    Aug 25, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-24_23_45_18-5043346545259508054
    Aug 25, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-25T06:45:18.966Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 25, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-25T06:45:26.160Z: Worker configuration: n1-standard-1 in us-central1-f.
    Aug 25, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-25T06:45:26.781Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 25, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-25T06:45:26.822Z: Expanding GroupByKey operations into optimizable parts.
    Aug 25, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-25T06:45:26.850Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 25, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-25T06:45:26.915Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 25, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-25T06:45:26.939Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 25, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-25T06:45:26.963Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 25, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-25T06:45:27.002Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 25, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-25T06:45:27.437Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 25, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-25T06:45:27.518Z: Starting 5 workers in us-central1-f...
    Aug 25, 2020 6:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-25T06:45:52.386Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
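
For context on this warning: each unique namespace/name pair created through the Beam Metrics API becomes its own Dataflow/Stackdriver metric descriptor, independent of which DoFn reports it. A minimal, hypothetical example of such a user-defined metric (class and metric names are illustrative, not taken from the test code):

    import org.apache.beam.sdk.metrics.Counter;
    import org.apache.beam.sdk.metrics.Metrics;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.values.Row;

    class RowCountingFn extends DoFn<Row, Row> {
      // Each unique namespace/name pair ("perf_test", "rows_seen") maps to one
      // metric descriptor, no matter how many DoFns increment it.
      private final Counter rowsSeen = Metrics.counter("perf_test", "rows_seen");

      @ProcessElement
      public void processElement(@Element Row row, OutputReceiver<Row> out) {
        rowsSeen.inc();
        out.output(row);
      }
    }
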
    Aug 25, 2020 6:46:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-25T06:45:58.450Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 25, 2020 6:46:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-25T06:46:24.738Z: Workers have started successfully.
    Aug 25, 2020 6:46:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-25T06:46:24.765Z: Workers have started successfully.
    Aug 25, 2020 6:46:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-25T06:46:55.983Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 25, 2020 6:46:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-25T06:46:56.140Z: Cleaning up.
    Aug 25, 2020 6:46:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-25T06:46:56.213Z: Stopping worker pool...
    Aug 25, 2020 6:47:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-25T06:47:47.964Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 25, 2020 6:47:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-25T06:47:48.006Z: Worker pool stopped.
    Aug 25, 2020 6:47:56 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-24_23_45_18-5043346545259508054 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 2de4a9d0-7f94-4890-8700-fb1dc20f841a and timestamp: 2020-08-25T06:47:56.380000000Z:
                     Metric:                    Value:
                   read_time                    10.675
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 25, 2020 6:47:56 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 2 mins 50.76 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 39s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/mo7qdlif4avve

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #912

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/912/display/redirect?page=changes>

Changes:

[Robin Qiu] Propagate BigQuery streaming insert throttled time to Dataflow worker in

[noreply] Merge pull request #12415 from [BEAM-10603] Add the RecordingManager and


------------------------------------------
[...truncated 293.76 KB...]
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 25, 2020 12:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 25, 2020 12:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 25, 2020 12:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 25, 2020 12:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 25, 2020 12:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 25, 2020 12:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 25, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
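
The coder error above can usually be resolved by attaching a schema to the Row PCollection before it is consumed, as the message itself suggests. A minimal sketch of that fix, assuming a hypothetical helper and a field layout matching the projected columns (field types are assumptions; this is not the test's actual code):

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    class RowSchemaSketch {
      // Hypothetical helper: attach a schema so the Row PCollection gets a
      // SchemaCoder and the "Unable to return a default Coder" check passes.
      static PCollection<Row> withAssumedSchema(PCollection<Row> rows) {
        Schema schema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();
        return rows.setRowSchema(schema);
      }
    }

With the schema attached, the collection carries a SchemaCoder, so no explicit setCoder call is needed.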

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 25, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 25, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 25, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 25, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 25, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 25, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 25, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 25, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
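
The "Pushing down the following filter" entry shows what the planner delegates to the BigQuery Storage API in DIRECT_READ mode: only the used fields are read and the supported predicate is evaluated at the source. A rough hand-written analogue, as a sketch with an assumed table name (the test's table is configured elsewhere):

    import java.util.Arrays;
    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;

    class DirectReadSketch {
      // Hand-written analogue of the pushed-down plan above: read only the used
      // fields and let the Storage API evaluate the supported filter.
      static TypedRead<TableRow> pushedDownRead() {
        return BigQueryIO.readTableRows()
            .from("bigquery-public-data:hacker_news.full") // assumed table name
            .withMethod(Method.DIRECT_READ)
            .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
            .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2");
      }
    }
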
    Aug 25, 2020 12:45:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 25, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 25, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-J7j0nMUX0cM3IKk6NIjm2j65OnB6VsMbnAuEyrEpbIE.jar
    Aug 25, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-IpgVv3KV3zVbPPSTX7lCF5pt4y_56y8S-TkiWzKOLAw.jar
    Aug 25, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-74UGewjq8iBieGD4QcbswjMPUIYZ17U42TmPbdi6DVo.jar
    Aug 25, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-h9ujrAgps3NOYNSsig1ndqG26WBC2QWSufMLHbRCeeo.jar
    Aug 25, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-KBLVXarD_I6t-pR0lco38Ec0Ji2HnL5sxq5IbFNjARg.jar
    Aug 25, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-sMkZTGfFXGZIjdoUjhTqJfIBOTHq15865tGmqrT7Lbo.jar
    Aug 25, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-t5nZvXv1uthjAXgs_Y8uqp8Xa7rdSsAJLcexNQUQTQE.jar
    Aug 25, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-pCZy8QSZfxLCsboqQAYxNp8esTH2pAo5HB9BgZZs_2g.jar
    Aug 25, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-3ktM_1biKfjQAH1NAjdeUTY2BZ5ItwpTPUoj_H8Ate8.jar
    Aug 25, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-D2HY0IVgAmX5_AxR3Pc-ECYod3eoxBNxn4CfHNjcNEg.jar
    Aug 25, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-WQvANSKBc-bJd0jvUXmpZcBz6J3AnSZ6CnEOIWfDUsQ.jar
    Aug 25, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-zlFVannp2UTWrRlWWSJM4Dk0ClELirxOEnRRTK-jGJE.jar
    Aug 25, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-0r7C-NrzqrKBekZSiVad6FxrN49Sf6zwKq4o0IVFjSM.jar
    Aug 25, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-leeAVEBCv9np1TPnk0ZoYAfkkhGuODyBS5b_KQsWOLs.jar
    Aug 25, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-Jp7sdZ9hoLfC202B7sS1R-IxAejSh_VvVTWIOsJtUN4.jar
    Aug 25, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-J7j0nMUX0cM3IKk6NIjm2j65OnB6VsMbnAuEyrEpbIE.jar
    Aug 25, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-3B_Dklkk9cSlBc4NzOBFNxl73j9m_XUCcBsW8P9H5h0.jar
    Aug 25, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-Qw3JdV6Yv_mZXwsm2AhrQCIS5gnDSAzT5TsDN3TVNig.jar
    Aug 25, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2490149785696202548.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-tilqXwDKogv2AE5FmTyDdh240MgrKQkUzLzsLMlYBIA.jar
    Aug 25, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-BjoULplfhAEe4CEvfkaoVSTFLJjF2zVNs7Nj3Tz5qc8.jar
    Aug 25, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-vm3Q8GzGkTtvTyDJjo7JWkMtwmSHPfbouaklQaGggC0.jar
    Aug 25, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-lg5T1v9HaqOeieEUs6vDLDFBUZZvNexSw1kzH3Z5GgE.jar
    Aug 25, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests--80NIS-5-AjFIQM3bxPNDGNR6bWREIGFlWXuvqtG4Co.jar
    Aug 25, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-LAQ8k-ttzxr9AF0jSeo61infZNNz2kHn7qpXapBkvcM.jar
    Aug 25, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-q7ZgQmVzn4IGAhUF-hdhv2nwDSWRH76Dsq3g8yjiPdU.jar
    Aug 25, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-OCZna1AbPd0z93NP22FnqKhF35oXbtVZE3-5OiUS_C8.jar
    Aug 25, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-brpl47KBKIMMTUdPKms2g6eCxVFG8QImVVOYzMIF2H8.jar
    Aug 25, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-7ljkNb4VU8K6NdNMKa7-YdiiYVk1bGXQP6KWFA_yAgE.jar
    Aug 25, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-P5DyhF37opKuuzOfuwptC6Jgy8qdjcaT5ohIR7yMjiE.jar
    Aug 25, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-ABwzcOqo5thPTA5wNFZGN_ybbhmFFVbCyGlhKTYrgQk.jar
    Aug 25, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-Vvk-RuPNF9Luw0uy1RFvSWDv5JxZb2aRAN1a-4vRcaU.jar
    Aug 25, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 1 seconds
    Aug 25, 2020 12:45:19 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 25, 2020 12:45:19 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 25, 2020 12:45:19 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 25, 2020 12:45:19 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 25, 2020 12:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 25, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92128 bytes, hash 6aef1c09ee779e93692b69b4f8b00e47a9a67222cee89aff06e89150bd42e35d> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-au8cCe53npNpK2m0-LAOR6mmciLO6Jr_BuiRUL1C410.pb
    Aug 25, 2020 12:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Aug 25, 2020 12:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-24_17_45_20-12694070687589557277?project=apache-beam-testing
    Aug 25, 2020 12:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-24_17_45_20-12694070687589557277
    Aug 25, 2020 12:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-24_17_45_20-12694070687589557277
    Aug 25, 2020 12:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-25T00:45:20.143Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 25, 2020 12:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-25T00:45:28.161Z: Worker configuration: n1-standard-1 in us-central1-f.
    Aug 25, 2020 12:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-25T00:45:28.912Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 25, 2020 12:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-25T00:45:28.948Z: Expanding GroupByKey operations into optimizable parts.
    Aug 25, 2020 12:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-25T00:45:28.978Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 25, 2020 12:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-25T00:45:29.062Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 25, 2020 12:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-25T00:45:29.090Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 25, 2020 12:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-25T00:45:29.126Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 25, 2020 12:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-25T00:45:29.161Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 25, 2020 12:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-25T00:45:29.562Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 25, 2020 12:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-25T00:45:29.662Z: Starting 5 workers in us-central1-f...
    Aug 25, 2020 12:45:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-25T00:45:53.997Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 25, 2020 12:46:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-25T00:46:04.154Z: Autoscaling: Raised the number of workers to 3 based on the rate of progress in the currently running stage(s).
    Aug 25, 2020 12:46:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-25T00:46:04.184Z: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
    Aug 25, 2020 12:46:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-25T00:46:09.628Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 25, 2020 12:46:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-25T00:46:26.472Z: Workers have started successfully.
    Aug 25, 2020 12:46:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-25T00:46:26.503Z: Workers have started successfully.
    Aug 25, 2020 12:47:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-25T00:47:03.736Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 25, 2020 12:47:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-25T00:47:04.117Z: Cleaning up.
    Aug 25, 2020 12:47:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-25T00:47:04.191Z: Stopping worker pool...
    Aug 25, 2020 12:47:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-25T00:47:52.145Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 25, 2020 12:47:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-25T00:47:52.194Z: Worker pool stopped.
    Aug 25, 2020 12:48:01 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-24_17_45_20-12694070687589557277 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 0d1f4ea3-22fd-4e6d-8d4e-a0267cdd95d4 and timestamp: 2020-08-25T00:48:01.881000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     16.44

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 25, 2020 12:48:02 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.031 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 2,5,main]) completed. Took 2 mins 55.56 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 44s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/5jek5mtktrilk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #911

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/911/display/redirect?page=changes>

Changes:

[Udi Meiri] Silence red PYTHONPATH warning from tox

[tobiasz.kedzierski] [BEAM-10682] Add workflow to run Java tests on Linux/Windows/Mac

[Kamil Wasilewski] [BEAM-10524] JSON decoder for ReadFromBigQuery supports repeatable


------------------------------------------
[...truncated 295.58 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 24, 2020 6:45:28 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 24, 2020 6:45:28 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 24, 2020 6:45:28 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 24, 2020 6:45:28 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 24, 2020 6:45:28 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 24, 2020 6:45:28 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 24, 2020 6:45:29 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 24, 2020 6:45:29 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 24, 2020 6:45:29 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 24, 2020 6:45:29 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 24, 2020 6:45:29 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 24, 2020 6:45:29 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 24, 2020 6:45:29 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 24, 2020 6:45:29 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 24, 2020 6:45:29 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 24, 2020 6:45:30 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 24, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 24, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-911ATAKIiBiWA2OyDTlwbH0xQ8B-Jbkt5JJNZJqXYeo.jar
    Aug 24, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-RYM7ArnuKQhFAYhS5XN8mCJNX2k7EYtiVlGBD98NDxs.jar
    Aug 24, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-IqsLvrnR3AtiDWbzPIleivdlG_RKcrragbjKwc3Dj2U.jar
    Aug 24, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-NfbqIBHRMMuQoGK4hfvxh9mrThJ2f2KM7DQhJjDnMJQ.jar
    Aug 24, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-r2ONJpNkeEvGYH8uy4Ggyi9mDXTI4btKZJw1XExqQxw.jar
    Aug 24, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-e9FVN3iI1Ka_FvOyLKfJPi18gyHv5lgBpakVOrtZcYM.jar
    Aug 24, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-hITNZ7BGtRIB_PhviZCzzfXqgsyNXgIrerkGhg9K4FU.jar
    Aug 24, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-ADoqqrAotvmDiRSdy-b7r-qZ_4u-FlAMhmdV3dbK04c.jar
    Aug 24, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-tZ9qkIlSj7sVGcckZZZTsiKADQBZ0TsIFA53yowQ-pI.jar
    Aug 24, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-z49AfnB3UKvizETQEuYXXex-gw5S0lsjjJhaAejh0dU.jar
    Aug 24, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-IVX-4J3nB68k71g4Mmpz3bbbcRt4J1cEa5Ltm6IcfYk.jar
    Aug 24, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-xsXWkH2bBG9O5T8LSmfkyPNZQqT3Tpfduoa9ZXGik18.jar
    Aug 24, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-RYM7ArnuKQhFAYhS5XN8mCJNX2k7EYtiVlGBD98NDxs.jar
    Aug 24, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-JMNVQBxQm2IJEKcD60jJkuO485zdjdzmUJNmB4lY5hQ.jar
    Aug 24, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-HFkVq9hy-vDL2yX0NZBzcO9W-6jHAUFvYCD3OMrxfSc.jar
    Aug 24, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-w7qgbvJcPRQGAlcNs6LzEZY7hC4_Nw0gJrxYo7CCtgA.jar
    Aug 24, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-N08KajznDGhMiy693ULUeS3oL2PlnOn8-raG9CWDpNY.jar
    Aug 24, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-9vfP9oNH0BuSXOp1b3IYfngKSiRj_1FVrx2LrSbGtLg.jar
    Aug 24, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-VFPyUG7IFyOvZhjSA0TN2qSGNoTV4gSKK3TpWnkdX-w.jar
    Aug 24, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-OHpD4fgwqzDASkunLL4c4QWGfHbT7JS7Mq5IPgDn490.jar
    Aug 24, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-ywjx7dVHVfNS_3RC4hvMSY7yvebSJFnVBe8J1TB8fzc.jar
    Aug 24, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-fTNG5sfi6o8_UwD1rgBCnEbbQCrfETjd6y7jNCnO67M.jar
    Aug 24, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-AWOm_33owAaroTYct95kLYkJpMmVkTiUCegVBwO-GsI.jar
    Aug 24, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7933593527796583170.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-S2k_riZjLbiq2KLrmXSxkIymeEvY7hNq7EoGq7SVA34.jar
    Aug 24, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-u54AvIQqcPceTIwz5AEDSQnw5USZIJXmr3ae7FPRquI.jar
    Aug 24, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-BVdowfybeN6PThFK0RBxQGArtqazoNnE8gttxbN0eOY.jar
    Aug 24, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-kTIkXOnG-etuZcVey3jD4UWv5d6s0b-aNw-HKyxu9us.jar
    Aug 24, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-OITgB3ofoR8buJDt299RBVT1XwFC3cgZizTEJ_69V2I.jar
    Aug 24, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-vxpvlp7g6B-Nd2b9AIckE7OULQQNdZRKsqnqs1GdP7E.jar
    Aug 24, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-uun2zyk9zbnUVr7erevrVb_I-ek_rAdIelQHvoErMh0.jar
    Aug 24, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-5YBhUgXZEK_ywgG2W_-ePeowJXD6SJ3SnpAfyWLL7RI.jar
    Aug 24, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 1 seconds
    Aug 24, 2020 6:45:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 24, 2020 6:45:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 24, 2020 6:45:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 24, 2020 6:45:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 24, 2020 6:45:33 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 24, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92128 bytes, hash aeda67369c439012a3c03af558ceeb000fd9abeb621827327658c97df6dec785> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-rtpnNpxDkBKjwDr1WM7rAA_Zq-tiGCcydljJffbex4U.pb
    Aug 24, 2020 6:45:34 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Aug 24, 2020 6:45:35 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-24_11_45_34-254140864726986791?project=apache-beam-testing
    Aug 24, 2020 6:45:35 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-24_11_45_34-254140864726986791
    Aug 24, 2020 6:45:35 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-24_11_45_34-254140864726986791
    Aug 24, 2020 6:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-24T18:45:34.181Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 24, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-24T18:45:41.975Z: Worker configuration: n1-standard-1 in us-central1-b.
    Aug 24, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-24T18:45:42.710Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 24, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-24T18:45:42.755Z: Expanding GroupByKey operations into optimizable parts.
    Aug 24, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-24T18:45:42.794Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 24, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-24T18:45:42.900Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 24, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-24T18:45:42.933Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 24, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-24T18:45:42.970Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 24, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-24T18:45:43.006Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 24, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-24T18:45:43.437Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 24, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-24T18:45:43.522Z: Starting 5 workers in us-central1-b...
    Aug 24, 2020 6:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-24T18:45:54.944Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 24, 2020 6:46:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-24T18:46:15.493Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 24, 2020 6:46:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-24T18:46:50.252Z: Workers have started successfully.
    Aug 24, 2020 6:46:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-24T18:46:50.293Z: Workers have started successfully.
    Aug 24, 2020 6:47:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-24T18:47:27.732Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 24, 2020 6:47:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-24T18:47:27.909Z: Cleaning up.
    Aug 24, 2020 6:47:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-24T18:47:28.002Z: Stopping worker pool...
    Aug 24, 2020 6:48:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-24T18:48:13.952Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 24, 2020 6:48:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-24T18:48:14.002Z: Worker pool stopped.
    Aug 24, 2020 6:48:22 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-24_11_45_34-254140864726986791 finished with status DONE.
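
The log above prints the gcloud command for cancelling the submitted Dataflow job; the same can be done from the submitting JVM through the PipelineResult handle returned by Pipeline.run(). A minimal sketch, assuming `pipeline` is an already-constructed Beam pipeline (this is not the test's actual code):

    import java.io.IOException;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.PipelineResult;

    public class CancelSketch {
      /** Submits the pipeline and cancels the remote job if it is still running. */
      static void runAndMaybeCancel(Pipeline pipeline) throws IOException {
        PipelineResult result = pipeline.run();      // non-blocking submit; Dataflow returns a job handle
        // ... decide the job should stop (e.g. a test timeout fired) ...
        if (result.getState() == PipelineResult.State.RUNNING) {
          result.cancel();                           // same effect as the gcloud command printed above
        }
      }
    }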

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 3ac9cff3-bf60-469c-86ec-4db03687f002 and timestamp: 2020-08-24T18:48:22.308000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    14.997

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 24, 2020 6:48:22 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
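
This warning means the InfluxDB measurement and database were never configured for the run, so the fields_read and read_time values above are only printed, not persisted. A rough sketch of how such settings are normally built for Beam's test-utils publisher; the builder method names and the .get() terminal call are recalled from the org.apache.beam.sdk.testutils.publishing package and should be treated as assumptions, as should all of the placeholder values:

    import org.apache.beam.sdk.testutils.publishing.InfluxDBSettings;

    class InfluxSketch {
      // Assumed API: InfluxDBSettings.builder() with withHost/withDatabase/withMeasurement/.get().
      static InfluxDBSettings settings() {
        return InfluxDBSettings.builder()
            .withHost("http://localhost:8086")             // placeholder InfluxDB endpoint
            .withDatabase("beam_test_metrics")             // placeholder database name
            .withMeasurement("sql_bqio_read_java_batch")   // placeholder measurement name
            .get();
      }
    }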

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.017 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 3 mins 1.958 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 5s
106 actionable tasks: 73 executed, 33 from cache

Publishing build scan...
https://gradle.com/s/xotgers22nkvk

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #910

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/910/display/redirect>

Changes:


------------------------------------------
[...truncated 294.30 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 24, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 24, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 24, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 24, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 24, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 24, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 24, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
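
Both failing tests stop at the same point: the PCollection of Rows produced by ParDo(RowMonitor) has no coder, and inference cannot supply one because a Beam Row needs an explicit schema. A minimal sketch of the two remedies the exception message names, with an illustrative schema rather than the test's actual HACKER_NEWS schema:

    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    class RowSchemaSketch {
      /** Attaches a schema to a PCollection of Rows so coder inference succeeds. */
      static PCollection<Row> withSchema(PCollection<Row> rows) {
        // Illustrative columns; the real field types depend on the table being read.
        Schema schema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt32Field("score")
                .build();

        // Remedy 1: PCollection.setRowSchema(...), as the message suggests.
        rows.setRowSchema(schema);
        // Remedy 2 (alternative): set the coder explicitly.
        // rows.setCoder(RowCoder.of(schema));
        return rows;
      }
    }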

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 24, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 24, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 24, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 24, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 24, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 24, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 24, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 24, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
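
The plan above is the interesting part of this test: with DIRECT_READ the projection (usedFields) and the filter are handed to the BigQuery source instead of being evaluated in a downstream Calc, so only the four selected columns and the filtered rows leave BigQuery. Purely as an illustration of how such a table and query can be declared with Beam SQL; the DDL column list, the project:dataset.table location, and the use of SqlTransform.withDdlString are assumptions here, not taken from this test's code:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class PushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // "method": "DIRECT_READ" selects the BigQuery Storage API, which is what
        // makes projection and filter push-down into the source possible.
        String ddl =
            "CREATE EXTERNAL TABLE HACKER_NEWS (`by` VARCHAR, type VARCHAR, title VARCHAR, score INTEGER) "
                + "TYPE bigquery "
                + "LOCATION 'my-project:my_dataset.hacker_news' "   // placeholder table
                + "TBLPROPERTIES '{\"method\": \"DIRECT_READ\"}'";

        PCollection<Row> rows =
            p.apply(
                SqlTransform.query(
                        "SELECT `by` AS author, type, title, score FROM HACKER_NEWS "
                            + "WHERE (type = 'story' OR type = 'job') AND score > 2")
                    .withDdlString(ddl));

        p.run().waitUntilFinish();
      }
    }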
    Aug 24, 2020 12:45:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 24, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 24, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-1DTrOFnfwBty29Dki7qVF3bKH83RUr34xocffW0uGiw.jar
    Aug 24, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-0vna9Qo_2oad-Rz1_6-dFsfv_xn94wnKf122vH3L_Us.jar
    Aug 24, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-0Ay3hdGj-uYK7u7YHlUi-30NWDGAZANt026KTRFt0HI.jar
    Aug 24, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-3Q9oPKhoDjjFs55KS1DoN9EGDPqVjO85K35GVJrhUOM.jar
    Aug 24, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-ix3dymjMk5tjTDChbULHcG9vjrTUffOJdW3Yw3KAOgg.jar
    Aug 24, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-exx8D9MQFraV66GIEN7r9-r7aM5gKIc8h264-dgQV9I.jar
    Aug 24, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-IWaM5JE8ySYcUa9lidJAb7_Iq7fTSXb1xI4534-xzk4.jar
    Aug 24, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-iF7Qj29L2MwF51dL9I_hZRBBt0-5cg9JhT1n6A6oCa4.jar
    Aug 24, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-FqYug1xJsafiYuX1VJojI-IEE-R_d3marMU4sSVfmUw.jar
    Aug 24, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-DeyV8ngZ6drtL8dTU---xZnPsu26cu8xC_IPXpYRpUc.jar
    Aug 24, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-vytL6nXzeNo0Zq9crlio65cPQJMGa4jGNQ3D8WztBpg.jar
    Aug 24, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3247756888833828185.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-hS9Ogv1NbTGbJmicuLDPRBLBYg-rxnZ8SRQhHMXuF0k.jar
    Aug 24, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-hKGCseDTqCd7bv7RV5FyVlJ2Is-6YIVLqXi06-kMQDo.jar
    Aug 24, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-LJCGIaKyC5QZOxc_rwpwDM-gETha7pkYUoNZV3XITyI.jar
    Aug 24, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-r3rvVRlsF90lJWC_2EwjXWpx-hIhVRlkdvS8V8T1RSc.jar
    Aug 24, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-JYfr2GfgdIfB33NDUFpFBviLZcqbURJsKlpQOJ9t24w.jar
    Aug 24, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-XOVB4hfBMOLoI64An6tH98QSiHJMDloWTfkppgf-smk.jar
    Aug 24, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-eeoFwycHEe2vYwxf0GdvXB5cm6oqmKydYGhFri-Q_2A.jar
    Aug 24, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-28pf2EVvEasZgKYFp64tM-brScB0uvFpkRxMXVn6_zk.jar
    Aug 24, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-kOATW6Av1MhbLRWC5p2UicIFpIJ46eZewclRm14J8Yk.jar
    Aug 24, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-VCLFkvy1IENgGa7Vps8hNZ6xPqEFHGx9s6O5jlSCBTI.jar
    Aug 24, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-0saXi5FWEFr4qDam-Gncoy81C4LtC_fVbDhssqgTMZI.jar
    Aug 24, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-Gvs-bYh47HjUeWyZ-S3YTgLDBM78Ie44zX0RSNYzQ9k.jar
    Aug 24, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-_tiJ5XOMHrHVE-VlbpR6rC7XVYlgNl1mH9faKS6-0IE.jar
    Aug 24, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-1DTrOFnfwBty29Dki7qVF3bKH83RUr34xocffW0uGiw.jar
    Aug 24, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-vioyReHDQakxGvmbUIthk7zn2hH2X3wP51JXUPkNAwQ.jar
    Aug 24, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-5OutSOlOgaejqVsYwgr9qSjahY2k6HFCFP_cnNOewdk.jar
    Aug 24, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-KVUvmhFVto6RxJCXsRuxqS8EoolMi-6PL_kcwqtSy-c.jar
    Aug 24, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-Oc7JnGek_PqO1fIsunmvJ47-7reHf4NYgRr_tApFyhA.jar
    Aug 24, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-4M8Tdb3XwVsT735zMwHNGBlSNeh0haBVKT2JWVh8OVY.jar
    Aug 24, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-6trNEBD3mxB0kuNrUCCcuh4y_ktRviD87pqq4vA5-lQ.jar
    Aug 24, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 1 seconds
    Aug 24, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 24, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 24, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 24, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 24, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 24, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92128 bytes, hash 5e06bfcd096da31c77ff539ed637734a6b3f4586fa1039cbb7970e69ce9b516e> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Xga_zQltoxx3_1Oe1jdzSms_RYb6EDnLt5cOac6bUW4.pb
    Aug 24, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Aug 24, 2020 12:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-24_05_45_18-3097192366153179394?project=apache-beam-testing
    Aug 24, 2020 12:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-24_05_45_18-3097192366153179394
    Aug 24, 2020 12:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-24_05_45_18-3097192366153179394
    Aug 24, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-24T12:45:18.928Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 24, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-24T12:45:26.863Z: Worker configuration: n1-standard-1 in us-central1-f.
    Aug 24, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-24T12:45:27.843Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 24, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-24T12:45:28.174Z: Expanding GroupByKey operations into optimizable parts.
    Aug 24, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-24T12:45:28.206Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 24, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-24T12:45:28.303Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 24, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-24T12:45:28.352Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 24, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-24T12:45:28.389Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 24, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-24T12:45:28.426Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 24, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-24T12:45:28.974Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 24, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-24T12:45:29.057Z: Starting 5 workers in us-central1-f...
    Aug 24, 2020 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-24T12:45:42.150Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 24, 2020 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-24T12:45:57.588Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 24, 2020 12:46:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-24T12:46:19.424Z: Workers have started successfully.
    Aug 24, 2020 12:46:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-24T12:46:19.461Z: Workers have started successfully.
    Aug 24, 2020 12:47:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-24T12:46:58.244Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 24, 2020 12:47:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-24T12:46:58.641Z: Cleaning up.
    Aug 24, 2020 12:47:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-24T12:46:58.835Z: Stopping worker pool...
    Aug 24, 2020 12:47:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-24T12:47:45.887Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 24, 2020 12:47:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-24T12:47:45.966Z: Worker pool stopped.
    Aug 24, 2020 12:47:55 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-24_05_45_18-3097192366153179394 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 57404e65-7c8e-4892-b280-128ff258a77c and timestamp: 2020-08-24T12:47:55.864000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    17.954

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 24, 2020 12:47:56 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 2 mins 50.704 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 39s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/qg72rbtamu6is

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #909

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/909/display/redirect>

Changes:


------------------------------------------
[...truncated 294.97 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 24, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 24, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 24, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 24, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 24, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 24, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 24, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 24, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 24, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 24, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 24, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 24, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 24, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 24, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 24, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 24, 2020 6:45:14 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 24, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 24, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test9151776385866258916.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-BBjWn-b3XppS1b41EUNDJTqniDmZ9oaXC7xJaQopDqM.jar
    Aug 24, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-zvStzVYKHKTfm79bX8F2d2xX_BA6mD5YTfJspChfDl4.jar
    Aug 24, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-4ZUgPAzIM8Ty6U_2I7CusanCvDZVIfw-1yRkE2rUvBg.jar
    Aug 24, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-inYhe_m7ONdgQMx3ePc6NCZZd8IlwtzP21wjqlMQLf8.jar
    Aug 24, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-YMZLBqhgfVlOOTQBO1jJVA_2TTAKLwoA3eKkGlhLPEE.jar
    Aug 24, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-dlZeUJlLUIVDrX_Wayg5lXZxhuLnYbSmzeoTItynzj0.jar
    Aug 24, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-GIHDrs7ca7tyH6J1R8LlGlKG836cwybqTWZGtLWfR8Y.jar
    Aug 24, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-jFD4unbuiv9b5Um81hdd7U7YEjdtEypUtxZuMtdh7dY.jar
    Aug 24, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-fOZaR3KSugnWNvskJi2ImpHpxA-_KYjvk0UUXNbuLOU.jar
    Aug 24, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-B7ccYpZRwBh01VImQbfeRbvHZ6nVZG9pkYdSRwKeGbI.jar
    Aug 24, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-tSWWP-CPVHkSfXYM_DMIFFtkT7FzyhlIt12QS-XB09Q.jar
    Aug 24, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-mjNSzOUR1ZY6xKV6tZRIKgSLnrSoBdM-OTIAqgfWtSI.jar
    Aug 24, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests--5d69EhhiNlssJtnjXm278iEFFy5mKFsfufmynl3We8.jar
    Aug 24, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-FJK0WxLIQDRErWuI_WXMVbZAmvMtRfHmnFJP6Yt0XP8.jar
    Aug 24, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-Uvvl5B964p07Qm0XN99s8Isk4GnyZKD2YlM1NnnPzhI.jar
    Aug 24, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-nxh9JObDB4_TPyi_lyJZ35PwrPpEStp5OlpJupn2vCc.jar
    Aug 24, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-dlZeUJlLUIVDrX_Wayg5lXZxhuLnYbSmzeoTItynzj0.jar
    Aug 24, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-RQp8YZHVmCmphsBdYxlfM8j2R95XCYqBhSjw1O4i9dI.jar
    Aug 24, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-sSkWep3kWXA-chjQWoCKdTt9vkJe6LCJ0aaXLaxwfUs.jar
    Aug 24, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-emZpu12YgRq5M_jJQ1qj6Be5fqSxfD2ywW9yo2LSIio.jar
    Aug 24, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-kZ2TUN14x1mZeVIUsEhykpxOBfgSFi5j8MrGlH2jRlo.jar
    Aug 24, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-3x26L6lzHhqIf0vySrifYeI0GRIZk25lnMzv_X4rm_g.jar
    Aug 24, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-86NC39bvYE2WSUBPzBshCOE41z9ruIchetgRMbg0BEU.jar
    Aug 24, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-UATmEVdmeQeTk3fD13wqRzFwhPTi5qOua0xyWvfE_3c.jar
    Aug 24, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-jQa97BebJwEMxm-Yo0CyQyYgmKerxu3KZu7tvkVQVRY.jar
    Aug 24, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-9cGnBjOE2XBtGghSuv-hTsepGeLN3aKsMtRvU4tU3E0.jar
    Aug 24, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-tmbquN6BqQFuIdk7J4MBCzCXZGnBBN_JQXEbKYEnBoc.jar
    Aug 24, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-CfCteRMUTgZgdCPS0YLdTMVgkFM0Y_aYJAuc_yfR5dU.jar
    Aug 24, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-Jn0_dTkA-Sj6PMvjC214ylDWHioRtHqeCIYDB4XQFKU.jar
    Aug 24, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-3DBRii2_-DmMgJpyf7m01g13q8NPspahou3HjmF1mDw.jar
    Aug 24, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-TC9CrWfnDzvxcbvekzH9VR9f0GVjncooxTA9bMd8Ucs.jar
    Aug 24, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 1 seconds
    Aug 24, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 24, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 24, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 24, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 24, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 24, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92128 bytes, hash b3d02d93e8fe20b45d5449c86f90603e7cd612c892ec34e35d0b742145d702c7> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-s9Atk-j-ILRdVEnIb5BgPnzWEsiS7DTjXQt0IUXXAsc.pb
    Aug 24, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Aug 24, 2020 6:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-23_23_45_17-16028718382823394759?project=apache-beam-testing
    Aug 24, 2020 6:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-23_23_45_17-16028718382823394759
    Aug 24, 2020 6:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-23_23_45_17-16028718382823394759
    Aug 24, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-24T06:45:17.937Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 24, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-24T06:45:25.619Z: Worker configuration: n1-standard-1 in us-central1-f.
    Aug 24, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-24T06:45:26.371Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 24, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-24T06:45:26.417Z: Expanding GroupByKey operations into optimizable parts.
    Aug 24, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-24T06:45:26.450Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 24, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-24T06:45:26.523Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 24, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-24T06:45:26.549Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 24, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-24T06:45:26.578Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 24, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-24T06:45:26.616Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 24, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-24T06:45:27.083Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 24, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-24T06:45:27.166Z: Starting 5 workers in us-central1-f...
    Aug 24, 2020 6:45:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-24T06:45:52.947Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 24, 2020 6:46:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-24T06:46:00.704Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 24, 2020 6:46:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-24T06:46:14.352Z: Workers have started successfully.
    Aug 24, 2020 6:46:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-24T06:46:14.373Z: Workers have started successfully.
    Aug 24, 2020 6:46:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-24T06:46:51.811Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 24, 2020 6:46:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-24T06:46:52.236Z: Cleaning up.
    Aug 24, 2020 6:46:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-24T06:46:52.378Z: Stopping worker pool...
    Aug 24, 2020 6:47:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-24T06:47:46.353Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 24, 2020 6:47:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-24T06:47:46.400Z: Worker pool stopped.
    Aug 24, 2020 6:47:54 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-23_23_45_17-16028718382823394759 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 7f17223a-c3fb-4496-9e8a-6ef65c33ffd8 and timestamp: 2020-08-24T06:47:54.389000000Z:
                     Metric:                    Value:
                   read_time                    16.531
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 24, 2020 6:47:54 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.018 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 2 mins 50.084 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 38s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/od6xzbkkjioyc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #908

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/908/display/redirect>

Changes:


------------------------------------------
[...truncated 293.14 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 24, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 24, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 24, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 24, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 24, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 24, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 24, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
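
The exception above names the fix itself: the PCollection<Row> produced by ParDo(RowMonitor) needs an explicit schema (or coder) before the pipeline can be finalized. A minimal sketch, assuming a Row-to-Row monitoring DoFn and a schema matching the projected columns (both are illustrations, not the test's actual code):

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    // Schema for the projected columns; field names and types are assumptions for illustration.
    Schema projected =
        Schema.builder()
            .addNullableField("author", Schema.FieldType.STRING)
            .addNullableField("type", Schema.FieldType.STRING)
            .addNullableField("title", Schema.FieldType.STRING)
            .addNullableField("score", Schema.FieldType.INT64)
            .build();

    // 'rows' and 'RowMonitorFn' (a DoFn<Row, Row>) are assumed stand-ins for the IT's source and monitor.
    PCollection<Row> monitored =
        rows.apply("ParDo(RowMonitor)", ParDo.of(new RowMonitorFn()))
            .setRowSchema(projected);  // lets the SDK attach a RowCoder instead of failing coder inference
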

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 24, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 24, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 24, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 24, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 24, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 24, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 24, 2020 12:45:18 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 24, 2020 12:45:18 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 24, 2020 12:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 24, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 24, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-NQFlEH5C3K8o5PZlvRahtlo5Yo-jjdGVMmklRj6rR4g.jar
    Aug 24, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-LMRLhGoUuTx-Ky-zfbqpORL_4oGIeclkawhKK6YMqu0.jar
    Aug 24, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-jIQLF4i_16fAHYRepGhKIBqtlqWiraTzDjyDFOiL5Q8.jar
    Aug 24, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-VVrpsUcEQgRICw7MKq02t13kXGH1ZZeUF2YzdMtR6ag.jar
    Aug 24, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-IHqwzX7azWUTxJ5uNL3fYJP-ExMWsTVVN-S-Y2ctThM.jar
    Aug 24, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-oPuUS4I5ymM8ulNlWyWTcZTnw9Q-STHZlChrXztW9x8.jar
    Aug 24, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-P0GEIR_lAN2TCifCAby0IaimtOfGG-blLFKvd2zk8Sw.jar
    Aug 24, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-e0g0oy-11FPP87OFtvuofNPi5Th1zEzY6F16-gN-20c.jar
    Aug 24, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-08T2F55ALrVdY-MrSxc0vGboHVA_cDtPpQfHdhAv8oQ.jar
    Aug 24, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-extmmouAzLamYX2-Xwk1URLqmrc3g0APnVaU_v-IMXc.jar
    Aug 24, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-cXQz-zE1fNb5yeyBdWt2FEHNyKz6TuxpbEfYRP-vvfg.jar
    Aug 24, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-OmQWsoPWaCyGQFUZtWyi8S3Q7HCfNPdW-deTlrbC3Cs.jar
    Aug 24, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-urKd3J0u0-Q8AXg1ChHzz9A8ULzaKRe3P2bk4grUNPg.jar
    Aug 24, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-nLQwu6rTKCP8rFfse0MeDmSnK1bkUGBXRhP7TfoS7k8.jar
    Aug 24, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-Shcn5R79iWC3dlIwcsZTtoRJLPpxm7GHxN_rd2vnQ5Y.jar
    Aug 24, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-NQFlEH5C3K8o5PZlvRahtlo5Yo-jjdGVMmklRj6rR4g.jar
    Aug 24, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-99HqLJ81LMvIupqJoYeDscivx767eOE6rGj0U2JNYis.jar
    Aug 24, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-SVadpK6Gw3cs15GtsuAli9n7a3ZBJPHLA_sR7RbRfaA.jar
    Aug 24, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-i8ontA_-odeirtsehObOXEEdZbENPQ4YZ_MkycljZWc.jar
    Aug 24, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-ebmeDQUZlMHvkPRZR7wkQrt1nULrkXQK7xVD4wejR2I.jar
    Aug 24, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4453840213719532684.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-mABmdrfI5llV7qoLfnQOGvWxMQb2O72a9C1FHGUjYbI.jar
    Aug 24, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-mx1swUxgvqvDQWaFhMWcGTU-mNkRJuw6-MWbhB8t7nU.jar
    Aug 24, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-LS2JU3fD39mAANZPf1LorukNUGn1n1I8EwvHSpWwSVg.jar
    Aug 24, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-aPtHXAu8Bs0GJWZHItO1V6l7NJLbiElu3B42MFN8Dso.jar
    Aug 24, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-mzVb0Fgccyce3P2YNp2oFIs5fbavvg26uvoSC5hDTU4.jar
    Aug 24, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-G_8g3uUCrl0Qz8RF01kCMfJh84sLasbl9fdm5arU1yI.jar
    Aug 24, 2020 12:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-swbu-cxmoC4FdD9S66ybsBqLKbrKwEKQ7gGs2E4_9Bc.jar
    Aug 24, 2020 12:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-9i7WD7ZLTCkLzUZ5do7tk1LaN8x93lvFpz0WHULFr1s.jar
    Aug 24, 2020 12:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-cnrO8RQV8t2ZMkDiDaqlfKTi9ASJB8WZUKL3wFmH-IQ.jar
    Aug 24, 2020 12:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT--v5OgmZG1h2vbpg-wSmzJMyAB_-uAK6rB0DKtGe6urM.jar
    Aug 24, 2020 12:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-eRC2oQQRCISroxYiWo8VcGsRXzYiJsZKFjd0zDx3c7c.jar
    Aug 24, 2020 12:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 1 seconds
    Aug 24, 2020 12:45:21 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 24, 2020 12:45:22 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 24, 2020 12:45:22 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 24, 2020 12:45:22 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 24, 2020 12:45:22 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 24, 2020 12:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92128 bytes, hash d705d4cee4042086072b38459a09ba9a602164557dbf4071a6a1489678ec1d99> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-1wXUzuQEIIYHKzhFmgm6mmAhZFV9v0BxpqFIlnjsHZk.pb
    Aug 24, 2020 12:45:22 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Aug 24, 2020 12:45:23 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-23_17_45_22-4628370301406591535?project=apache-beam-testing
    Aug 24, 2020 12:45:23 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-23_17_45_22-4628370301406591535
    Aug 24, 2020 12:45:23 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-23_17_45_22-4628370301406591535
    Aug 24, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-24T00:45:22.474Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 24, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-24T00:45:30.679Z: Worker configuration: n1-standard-1 in us-central1-f.
    Aug 24, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-24T00:45:31.838Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 24, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-24T00:45:31.879Z: Expanding GroupByKey operations into optimizable parts.
    Aug 24, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-24T00:45:31.907Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 24, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-24T00:45:31.984Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 24, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-24T00:45:32.012Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 24, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-24T00:45:32.048Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 24, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-24T00:45:32.083Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 24, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-24T00:45:32.512Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 24, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-24T00:45:32.585Z: Starting 5 workers in us-central1-f...
    Aug 24, 2020 12:45:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-24T00:45:54.111Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 24, 2020 12:46:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-24T00:45:58.230Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 24, 2020 12:46:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-24T00:46:25.631Z: Workers have started successfully.
    Aug 24, 2020 12:46:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-24T00:46:25.666Z: Workers have started successfully.
    Aug 24, 2020 12:46:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-24T00:46:57.436Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 24, 2020 12:47:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-24T00:46:57.613Z: Cleaning up.
    Aug 24, 2020 12:47:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-24T00:46:57.712Z: Stopping worker pool...
    Aug 24, 2020 12:47:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-24T00:47:51.324Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 24, 2020 12:47:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-24T00:47:51.402Z: Worker pool stopped.
    Aug 24, 2020 12:48:02 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-23_17_45_22-4628370301406591535 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 7fda6b09-3e0d-4611-9509-d8656d130684 and timestamp: 2020-08-24T00:48:02.904000000Z:
                     Metric:                    Value:
                   read_time                    12.422
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 24, 2020 12:48:03 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.019 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.034 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker,5,main]) completed. Took 2 mins 53.876 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 47s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/bip4jyjym7rx2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #907

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/907/display/redirect>

Changes:


------------------------------------------
[...truncated 296.33 KB...]
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
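
Same root cause as the failures in the later builds above; the other option the message lists is to set the coder explicitly rather than the schema. A sketch of that variant, reusing the assumed PCollection<Row> and Schema from the earlier sketch:

    import org.apache.beam.sdk.coders.RowCoder;

    // 'monitored' and 'projected' are the assumed PCollection<Row> and Schema shown earlier.
    monitored.setCoder(RowCoder.of(projected));  // explicit RowCoder; equivalent to setRowSchema(projected)
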

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 23, 2020 6:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 23, 2020 6:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 23, 2020 6:45:12 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 23, 2020 6:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 23, 2020 6:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 23, 2020 6:45:12 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 23, 2020 6:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 23, 2020 6:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 23, 2020 6:45:14 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 23, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 23, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-Vhh4pBUL_KkchzXhJxlj0JOTruqY-t6-v5Cwj59jCw4.jar
    Aug 23, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-1dt9f4NS73LV322JWg15vY4Ah3VtE6x2n4tUcbLGT-U.jar
    Aug 23, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-k1_7Ic-iqgba3zj6_Sa3z1VXZP_lEWkTxQUdXuI9-LI.jar
    Aug 23, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-dt_-sAVJcUtE6mOsHSccWWNcCLijuKSzHHefnDiuREw.jar
    Aug 23, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-uchEgSzQS5s7O5uupga_Hw__0yVmGJN8d8nSokihfnw.jar
    Aug 23, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-MxoXsstjwCpa76m78sxSHN5mG6lgkIjP8UQJZpM2PqQ.jar
    Aug 23, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-ibi3EF2-HxjnRDuY8kXz1KOFN6jCF1m6UMo9pJKm66g.jar
    Aug 23, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-Fyf0oTzyihpwwD-v1PEHBHLSkrkhVnlxwpcvGoZfDt4.jar
    Aug 23, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-DxEX5y5viK8j0ykbXqV6xNQP3iZ22QRpA8_dVzhhHP4.jar
    Aug 23, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-EOs1F8x9sH3Zu5egA4WTepYjWU2TPYM-ooGIrEsbEno.jar
    Aug 23, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-ThmmyVf2uYvhO8Bn3Or-sEiaLn76oOOHCJPLgPI5GbY.jar
    Aug 23, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test1104912797193586415.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-JrRA7vxXAQbmy-l2rbNck7qlPm-fdsW1udtpY-3cNzY.jar
    Aug 23, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-lTSlkgu-1HBw_oyBT7qyCPZ60XvasyiYEBwGZUwn4tA.jar
    Aug 23, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-Xpjqd4y8xWlNQfARHkjtMD_vVYXkRDOxWeg-1OwQqFk.jar
    Aug 23, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-eBnhhv3guJJx7I3Y0bkJTvVEBtmjMC5VAcZSB0SiHl4.jar
    Aug 23, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-MlVONdfKuyipjXoD9TLDvCQX1328Pjo5ozrPgEZJHAI.jar
    Aug 23, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-SXY6DN7bXmXwCdj7wagPOABYL9oV69fAYI6PVrHrTUs.jar
    Aug 23, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-snI31bLGtxouAoRDovcDAufG2z7oR8JjI7mNzKZPMso.jar
    Aug 23, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-TxGu8Sa3QIPx0YHLxa9l2ku9EKlMosnpDtUBNly9ko0.jar
    Aug 23, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-5jOP-XHcoRXgdf-6814dcroqqTt5BmR-zjzB_9-pvM8.jar
    Aug 23, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-gF2y1fIWbWL-DmRqDZlGbFJGdpDfTizfXgBAncEEUok.jar
    Aug 23, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-lyOE3ZdFreloQ3cVCghkv21Ib0RwT1Bqt8oTPitXiwA.jar
    Aug 23, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-dt_-sAVJcUtE6mOsHSccWWNcCLijuKSzHHefnDiuREw.jar
    Aug 23, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-OFl-q11RFNj1J_7Froy4XI9HKHg0i8bsp-J0cLmUxJs.jar
    Aug 23, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-hH3UqA61b0mJoJOyWlZyvJb1hddsuR1MzflOErvATv8.jar
    Aug 23, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-q-wpsm4HmTc2Zyw7cTIdrd_u8NIn_4iOSj1oySG31rc.jar
    Aug 23, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-xZApv4GLWS_lqdNjhP_4iOeXNUGXjj1v5PfT1mAjNRY.jar
    Aug 23, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-dyxE6S5Xd2XxE6thuuFpB-9aapey_Oj_bfOZ6MOJDiQ.jar
    Aug 23, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-2AFK5kt7HzFobAdaZ67JrnzeidxpdsoE4agTe-5UFTQ.jar
    Aug 23, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-ImJWFFXqaLPZcHHSVLNTP8cWxYq31NZnq75r1JKU_zU.jar
    Aug 23, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-z9I2QuY3XANaPmBIY7a6hJx_0IxD37qnzd8KgjmOUH0.jar
    Aug 23, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/de.flapdoodle.embed/de.flapdoodle.embed.mongo/2.2.0/781d14f4e3d9eeb0b4c3e00a4ec165a04b3dc5c4/de.flapdoodle.embed.mongo-2.2.0.jar to gs://temp-storage-for-perf-tests/loadtests/staging/de.flapdoodle.embed.mongo-2.2.0-vNy3lJC0jW9u4Cy1AHsqSbjRUqOTX9ycpEmHkht7vvk.jar
    Aug 23, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/de.flapdoodle.embed/de.flapdoodle.embed.process/2.1.2/986b38302fa10018d5baccee7f655c31ee9afe5b/de.flapdoodle.embed.process-2.1.2.jar to gs://temp-storage-for-perf-tests/loadtests/staging/de.flapdoodle.embed.process-2.1.2-OasY7D5KRAimcZcWcjFwgi8Qb4B-iff1FfrVeNSih6A.jar
    Aug 23, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.codehaus.janino/commons-compiler/3.0.11/f2a6ec7fbc929c9fc87ff8bb486c0574951c5b09/commons-compiler-3.0.11.jar to gs://temp-storage-for-perf-tests/loadtests/staging/commons-compiler-3.0.11-DxpPXyZccBoxkzJErnBF_O8YtPpZUEF-Je5wvlDd2s8.jar
    Aug 23, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.apache.commons/commons-csv/1.8/37ca9a9aa2d4be2599e55506a6d3170dd7a3df4/commons-csv-1.8.jar to gs://temp-storage-for-perf-tests/loadtests/staging/commons-csv-1.8-qL1WZS7UZo2dWjOZSuUvWbnjnI6w68tmhOaK7udXmmE.jar
    Aug 23, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.codehaus.janino/janino/3.0.11/e699e368095379ba0402ea1780a87fcaea16e68f/janino-3.0.11.jar to gs://temp-storage-for-perf-tests/loadtests/staging/janino-3.0.11-kje3HSMpGA5ZIQ6aqhAO4xNFTvCuWIYIx1yxkxlZG-E.jar
    Aug 23, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.apache.beam/beam-vendor-calcite-1_20_0/0.1/6d16a59dc771784789116607a04acd9a0ef91d16/beam-vendor-calcite-1_20_0-0.1.jar to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-calcite-1_20_0-0.1-1NrX_9FNKiEqNk5qBOaRlj-IwqOvKvQIGIbTVgm_v8Y.jar
    Aug 23, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.antlr/antlr4/4.7/cd6df46532bccabd8127c18c9ca5ef481962e931/antlr4-4.7.jar to gs://temp-storage-for-perf-tests/loadtests/staging/antlr4-4.7-eGclcCizNzrwEd7nts6bWHqP1cegsl9osv9MuQvoqgc.jar
    Aug 23, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.mongodb/mongo-java-driver/3.9.1/d313237180bf9f2f82e12f503d9617e6b070f792/mongo-java-driver-3.9.1.jar to gs://temp-storage-for-perf-tests/loadtests/staging/mongo-java-driver-3.9.1-mxKxkvmYluxV-Hdn57uyt-MjjSQUsFjxFw9tjhx0bm4.jar
    Aug 23, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.antlr/antlr4-runtime/4.7/30b13b7efc55b7feea667691509cf59902375001/antlr4-runtime-4.7.jar to gs://temp-storage-for-perf-tests/loadtests/staging/antlr4-runtime-4.7-KmGUP4A7vR0OAt_9GbkqQY-DNAyZQ0aAnjtR4iMapsA.jar
    Aug 23, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.antlr/ST4/4.0.8/a1c55e974f8a94d78e2348fa6ff63f4fa1fae64/ST4-4.0.8.jar to gs://temp-storage-for-perf-tests/loadtests/staging/ST4-4.0.8-WMqrxAyfdLC1mT_YaOD2SlDAdZCU5qJRqq-tmO38ejs.jar
    Aug 23, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.antlr/antlr-runtime/3.5.2/cd9cd41361c155f3af0f653009dcecb08d8b4afd/antlr-runtime-3.5.2.jar to gs://temp-storage-for-perf-tests/loadtests/staging/antlr-runtime-3.5.2-zj_I7LEPOemjzdy7LONQ0nLZzT0LHhjm_nPDuTichzQ.jar
    Aug 23, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.glassfish/javax.json/1.0.4/3178f73569fd7a1e5ffc464e680f7a8cc784b85a/javax.json-1.0.4.jar to gs://temp-storage-for-perf-tests/loadtests/staging/javax.json-1.0.4-Dh3sQKHt6WWUElHtqWiu7gUsxPUDeLwxbMSOgVm9vrQ.jar
    Aug 23, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/net.java.dev.jna/jna/4.0.0/9b3a11c613ec3fd3440af4103b12c3de82d38b6e/jna-4.0.0.jar to gs://temp-storage-for-perf-tests/loadtests/staging/jna-4.0.0-2sJwtkQc4k2TqW3bbo-T2N8JkZJzh5mm9vz8KyQWyhk.jar
    Aug 23, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.abego.treelayout/org.abego.treelayout.core/1.0.3/457216e8e6578099ae63667bb1e4439235892028/org.abego.treelayout.core-1.0.3.jar to gs://temp-storage-for-perf-tests/loadtests/staging/org.abego.treelayout.core-1.0.3--l4xOVw5wufUasoPgfcgYJMWB7L6Qb02A46yy2-5MyY.jar
    Aug 23, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/net.java.dev.jna/jna-platform/4.0.0/deb6bf66918989b50209b8c9aaf3b2561af7f011/jna-platform-4.0.0.jar to gs://temp-storage-for-perf-tests/loadtests/staging/jna-platform-4.0.0-B21i7Yfna9yzdQ_-gKpJXuIRK9chJmOVTWVH9IJEkuk.jar
    Aug 23, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/com.ibm.icu/icu4j/58.2/db9fd4b4c189cf1518db14c67d14a2cfcfbe59f6/icu4j-58.2.jar to gs://temp-storage-for-perf-tests/loadtests/staging/icu4j-58.2-lT4eg7K-fD6i-I2obBNhT0fp5x01eMhSHX8Yd1a2OWI.jar
    Aug 23, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 164 files cached, 46 files newly uploaded in 2 seconds
    Aug 23, 2020 6:45:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 23, 2020 6:45:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 23, 2020 6:45:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 23, 2020 6:45:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 23, 2020 6:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 23, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92128 bytes, hash 3b2d7bb5b18cf17f44ce658bee63f51d4f7979517a37f01bc82eaff077f28ef2> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Oy17tbGM8X9EzmWL7mP1HU95eVF6N_AbyC6v8HfyjvI.pb
    Aug 23, 2020 6:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Aug 23, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-23_11_45_19-7829927695676864953?project=apache-beam-testing
    Aug 23, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-23_11_45_19-7829927695676864953
    Aug 23, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-23_11_45_19-7829927695676864953
    Aug 23, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-23T18:45:19.124Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 23, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-23T18:45:26.403Z: Worker configuration: n1-standard-1 in us-central1-f.
    Aug 23, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-23T18:45:27.269Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 23, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-23T18:45:27.311Z: Expanding GroupByKey operations into optimizable parts.
    Aug 23, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-23T18:45:27.347Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 23, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-23T18:45:27.437Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 23, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-23T18:45:27.475Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 23, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-23T18:45:27.682Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 23, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-23T18:45:27.718Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 23, 2020 6:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-23T18:45:28.137Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 23, 2020 6:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-23T18:45:28.212Z: Starting 5 workers in us-central1-f...
    Aug 23, 2020 6:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-23T18:45:33.531Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 23, 2020 6:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-23T18:45:54.024Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 23, 2020 6:46:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-23T18:46:13.616Z: Workers have started successfully.
    Aug 23, 2020 6:46:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-23T18:46:13.650Z: Workers have started successfully.
    Aug 23, 2020 6:46:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-23T18:46:55.053Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 23, 2020 6:46:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-23T18:46:55.188Z: Cleaning up.
    Aug 23, 2020 6:46:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-23T18:46:55.264Z: Stopping worker pool...
    Aug 23, 2020 6:47:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-23T18:47:48.649Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 23, 2020 6:47:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-23T18:47:48.687Z: Worker pool stopped.
    Aug 23, 2020 6:47:55 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-23_11_45_19-7829927695676864953 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 1fd9e84d-8d4c-4a90-8335-852021a108fc and timestamp: 2020-08-23T18:47:55.761000000Z:
                     Metric:                    Value:
                   read_time                    19.172
                 fields_read                 4375276.0
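
The read_time and fields_read values above are gathered by the ParDo(RowMonitor) and ParDo(TimeMonitor) steps in the submitted pipeline and read back through Beam's metrics API before being handed to the publisher. As a rough, hypothetical sketch (the "sql_bqio" namespace and metric name below are assumptions, not the test's actual identifiers), counter metrics like these can be queried from a finished PipelineResult as follows:

    import org.apache.beam.sdk.PipelineResult;
    import org.apache.beam.sdk.metrics.MetricNameFilter;
    import org.apache.beam.sdk.metrics.MetricQueryResults;
    import org.apache.beam.sdk.metrics.MetricResult;
    import org.apache.beam.sdk.metrics.MetricsFilter;

    public class MetricsQuerySketch {
      /** Prints counter metrics matching an assumed namespace/name from a finished run. */
      public static void printFieldsRead(PipelineResult result) {
        result.waitUntilFinish();
        MetricQueryResults metrics =
            result
                .metrics()
                .queryMetrics(
                    MetricsFilter.builder()
                        .addNameFilter(MetricNameFilter.named("sql_bqio", "fields_read"))
                        .build());
        for (MetricResult<Long> counter : metrics.getCounters()) {
          System.out.println(counter.getName() + ": " + counter.getAttempted());
        }
      }
    }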

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 23, 2020 6:47:56 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.018 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 2 mins 51.566 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 39s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/rwkmuybfs3iow

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #906

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/906/display/redirect>

Changes:


------------------------------------------
[...truncated 295.95 KB...]
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 23, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 23, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 23, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 23, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 23, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 23, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 23, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 23, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 23, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 23, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 23, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 23, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 23, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 23, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 23, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 23, 2020 12:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 23, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 23, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-ODF7fRqXjvFKTFLhkviNf5teW2paqWTGxlQsrRBUzrI.jar
    Aug 23, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-0_KefqqzLn1dZvWyxveEMOCTelK9aC6fSxRY_6PzrWc.jar
    Aug 23, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-uD_oKEdnAUMjsK3BNcoh8VJUaBVPLfWJgUK_Be0Xpxw.jar
    Aug 23, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-mM8z40nEPu9muvkAunY5VQ7JOxsa1N6bazWg_4CkiOM.jar
    Aug 23, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-iVn8odjnbSGlwKLJJtvvMQbqNLnDDpAF64nGVzULfWI.jar
    Aug 23, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-3N3bulZSdGRPkvnsyIhJCXBeKTwyeQWptHWfO8j9n2w.jar
    Aug 23, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-1jWgeoS3Icr03YQFlf1TzYbk15-yhpTUzjC8aykuqQM.jar
    Aug 23, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-iCDEZkJGS_WLlK1om1zxORQg31Zf7zQTlgADOtbwT3c.jar
    Aug 23, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-w5BHgqRqHXCfZOzye-Yu8YqyciOlkPAiZSUAr7zSZxQ.jar
    Aug 23, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-ggNZO_D9Qnt2uUvcSe0vyQvqAj8DE03XYlyLjmQ61uY.jar
    Aug 23, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-LdFjyvYHCTSW4MuEaa0Yf14qRkGudc0FJUy9CLFgUF0.jar
    Aug 23, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8750434063407446686.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-KgdVP_x7cqGzVHrloIToLBRc8Ksa_3ynRB1bPwV0JFY.jar
    Aug 23, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-FbvKG7LjP-e8L4XZOYIhcgXscHGDUkELVyReffZtZ9k.jar
    Aug 23, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-msoEJGYVJS_WwQ-roPZ0rQlme6T3WyaaDFp8DbELie4.jar
    Aug 23, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-U9jbDNoAJaaEH5yfnTS085f3bKytn0AV5P9VpvsvMlw.jar
    Aug 23, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-5eLAKN2nz8PBKS8pGXl_wf-GIr6xIGmrOD5wuCYd4Ws.jar
    Aug 23, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-YIJRKQUVtfAIa8wqgnRIxtIjHSJm6UUIhcbElIl50MY.jar
    Aug 23, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-XakgKpcvnPzLSPnoMr7wCqt4qwz5ICG977_A_vENbRQ.jar
    Aug 23, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-APN0HMEyR4qGPdwGJhSCK1pVoQebdYby67FZR6CFLLs.jar
    Aug 23, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-Ss9eo-Bqcz2yBX6LM_WuC-xG-jAVj7k3vRn7KKThwYc.jar
    Aug 23, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-419SqFxd-FIPLRW9-M9XqMwh6fcC10B1AwcbqG6hSfM.jar
    Aug 23, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-JHOTFbg7zBZT2CmixKTqsJ_atCO3Q8-iEHIrPsCXl7E.jar
    Aug 23, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-Wv2fKsnlDYGzE3KFDfYHvRRihAPCjCqEZc5TPjI1HJ0.jar
    Aug 23, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-EtqUnFmcecSpLl54w2rLB5CQSRgMPGiEy4l4P4ZDOKw.jar
    Aug 23, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-6TubXQ1XnA6E0a8O5mnn7rGwzeYmTsfnahroNJgd08A.jar
    Aug 23, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-gpA6k5-_aK_HOQdY4AxGKVx5Y5A3y81w5V0rax41Q3k.jar
    Aug 23, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-dh31qegmJMqbUTmNjh9bfsy8TQg5TdCT61lSXx-SIg8.jar
    Aug 23, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-ODF7fRqXjvFKTFLhkviNf5teW2paqWTGxlQsrRBUzrI.jar
    Aug 23, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-aYkQiNmL3HRyqAusgfLxtCrvNNVAQVyah8hecPnTkSA.jar
    Aug 23, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-BWToj30umfjzpnsyMHAiRoCtDl-Ef4nxdTDIXyBq_rI.jar
    Aug 23, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-1QK599VNiSoy-ITJvo8OETjqk-vA5sI7HB4nOOIhHEg.jar
    Aug 23, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 1 seconds
    Aug 23, 2020 12:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 23, 2020 12:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 23, 2020 12:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 23, 2020 12:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 23, 2020 12:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 23, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92128 bytes, hash 6390f92448678ea937f5716d388daf396ffe23e26dc44d3811a494e8351f6264> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Y5D5JEhnjqk39XFtOI2vOW_-I-JtxE04EaSU6DUfYmQ.pb
    Aug 23, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Aug 23, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-23_05_45_21-1159982432058206214?project=apache-beam-testing
    Aug 23, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-23_05_45_21-1159982432058206214
    Aug 23, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-23_05_45_21-1159982432058206214
    Aug 23, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-23T12:45:21.096Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 23, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-23T12:45:29.137Z: Worker configuration: n1-standard-1 in us-central1-f.
    Aug 23, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-23T12:45:29.917Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 23, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-23T12:45:29.958Z: Expanding GroupByKey operations into optimizable parts.
    Aug 23, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-23T12:45:29.986Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 23, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-23T12:45:30.065Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 23, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-23T12:45:30.101Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 23, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-23T12:45:30.136Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 23, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-23T12:45:30.171Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 23, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-23T12:45:30.498Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 23, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-23T12:45:30.580Z: Starting 5 workers in us-central1-f...
    Aug 23, 2020 12:46:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-23T12:46:00.996Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Aug 23, 2020 12:46:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-23T12:46:01.030Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Aug 23, 2020 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-23T12:46:01.755Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 23, 2020 12:46:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-23T12:46:06.255Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 23, 2020 12:46:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-23T12:46:30.094Z: Workers have started successfully.
    Aug 23, 2020 12:46:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-23T12:46:30.119Z: Workers have started successfully.
    Aug 23, 2020 12:47:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-23T12:47:01.392Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 23, 2020 12:47:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-23T12:47:01.527Z: Cleaning up.
    Aug 23, 2020 12:47:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-23T12:47:01.605Z: Stopping worker pool...
    Aug 23, 2020 12:47:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-23T12:47:55.116Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 23, 2020 12:47:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-23T12:47:55.164Z: Worker pool stopped.
    Aug 23, 2020 12:48:03 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-23_05_45_21-1159982432058206214 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): ec66d5a6-1cd8-4a26-a7eb-c8b8854573f9 and timestamp: 2020-08-23T12:48:03.390000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    10.791

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 23, 2020 12:48:03 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.019 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 2 mins 55.656 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 46s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/l4j36y3ar7xe6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #905

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/905/display/redirect>

Changes:


------------------------------------------
[...truncated 293.52 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 23, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 23, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 23, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 23, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 23, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 23, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 23, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
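
    [Editor's note: the IllegalStateException above is self-describing -- the PCollection of Beam Rows emitted by the RowMonitor ParDo has no schema attached, so no coder can be inferred. Below is a minimal, hypothetical Java sketch of the remedy the message itself names (PCollection.setRowSchema). The field names/types are assumed from the test query and the pass-through DoFn only stands in for the actual RowMonitor step; this is not the IT code.]

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaFixSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Assumed schema for the projected columns author/type/title/score.
        Schema schema = Schema.builder()
            .addNullableField("author", Schema.FieldType.STRING)
            .addNullableField("type", Schema.FieldType.STRING)
            .addNullableField("title", Schema.FieldType.STRING)
            .addNullableField("score", Schema.FieldType.INT64)
            .build();

        PCollection<Row> rows = p.apply(
            Create.of(Row.withSchema(schema).addValues("a", "story", "t", 3L).build())
                .withCoder(RowCoder.of(schema)));

        rows
            // Pass-through DoFn standing in for the test's RowMonitor step.
            .apply("RowMonitor", ParDo.of(new DoFn<Row, Row>() {
              @ProcessElement
              public void process(@Element Row row, OutputReceiver<Row> out) {
                out.output(row);
              }
            }))
            // Without this call, coder inference fails exactly as logged above.
            .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }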

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 23, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 23, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 23, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 23, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 23, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 23, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 23, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 23, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
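
    [Editor's note: for the passing readUsingDirectReadMethodPushDown case, the plan above shows the projection (usedFields) and the filter being handed to the BigQuery storage read. As a hedged illustration only -- the table reference below is a placeholder and none of this is taken from the test code -- the same push-down expressed directly against BigQueryIO with the Storage API would look roughly like this:]

    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        p.apply("Read Input BQ Rows with push-down",
            BigQueryIO.readTableRows()
                .from("bigquery-public-data:hacker_news.full")  // placeholder table reference
                .withMethod(TypedRead.Method.DIRECT_READ)
                // Column projection, mirroring usedFields=[by, type, title, score] above.
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                // Row restriction, mirroring the filter logged just above.
                .withRowRestriction("(`type` = 'story' OR `type` = 'job') AND `score` > 2"));

        p.run().waitUntilFinish();
      }
    }
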
    Aug 23, 2020 6:45:13 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 23, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 23, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-_xKEf5wMj_7tKfIPp5mUslcPQ4MgNeYr5PKgTCitY_k.jar
    Aug 23, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-nIf9zIoJ3AVKeE49otPLX-aV74rJW7nZjwogm5N8kcU.jar
    Aug 23, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-8GFEOgLy5O08SdkUyLAhLwqaV1iyiOMYTSgfeCXuP94.jar
    Aug 23, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3489515066731056798.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-lYa0ypkcUVD-txJ17u-4rmS9o7FFN8jGYEeMAW2_JJY.jar
    Aug 23, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-D0JkvckK9sn9Q0Rt0WQmNkfI-8C3Uo4sNakSIyuij2s.jar
    Aug 23, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT--mX2-ObKgjibjQF6NuNKl-PjRt1kz5qcjzbVpyzRIHw.jar
    Aug 23, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-AZMLqdbvHR9Htiz4x3D_pnAk5cuezg4Am25D9bV9LLg.jar
    Aug 23, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-i1_kGWHfzudkPnisyDRD1NV-TkSBeNGe-5j4dQX2B3c.jar
    Aug 23, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-xK_avUGBqj26y5RDwAAkKsA_7QwpMCz54vMDqK_Uc8Q.jar
    Aug 23, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-gZTMMTiLYxfc4NLQlxlmOLO_phVgbtc17Zh5jSeQ8-w.jar
    Aug 23, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-VNtcxIZb9ddTCF2J7nqKeBrweOa4-tcob743bzsBzkk.jar
    Aug 23, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-6o3L7jqfi87RJG4uORmO9EAe9LAmyLCxRu-SXZJj_7w.jar
    Aug 23, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-Fh0_WUzLfyEm4D9xd6kC2HBVcIQACUJoyJ7G8axeDvo.jar
    Aug 23, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-DsutZ9e3yhL2m8uFFTMhP5CndSqqMZcZvD6krjEBIIU.jar
    Aug 23, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-8XmG7cmOlx9HjJPw5cJouWCglRA9q-XPMwNZWeut17o.jar
    Aug 23, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-_xKEf5wMj_7tKfIPp5mUslcPQ4MgNeYr5PKgTCitY_k.jar
    Aug 23, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-TtDETaRCMx2-JvUNCtMqSWETFUoC5oYLdUMQL25Za8I.jar
    Aug 23, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-aPVAkdueP7FYjiv8Gb02X3HxcpFHrw5T8Cv1iCrUri0.jar
    Aug 23, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-hneKbkaMaQ2AYAjzQraxl-IdtXgAlbvA6vCFhOZ_48I.jar
    Aug 23, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-m6Y_QsEoeyR28k65tkCGTkqkHR3qAsrxGBvEnHBQomc.jar
    Aug 23, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-kn3-fX1BzSV8eNuiq42COlIhgAjSQllLw2idHwe7-8M.jar
    Aug 23, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT--mmAA75JBeHwq-VVHGPis-hvD1G-qNQun2HiqqzdT6Y.jar
    Aug 23, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-QYJ4C3m1eyxAVq7aeLqUzDTct6XssL7eWq5iJUX-dt8.jar
    Aug 23, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-2ONDuvEjZCdQbhEhnjFz-ga9-Iyh1V65xwPe1CbIZVc.jar
    Aug 23, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-8MUA-UFpuma5YnXDADk18L2OmhpELL60ou5YETkqdvE.jar
    Aug 23, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-VavrAVWon1xT23RLIMZb66x9KnFRFCqYIwPZ-vkYBAQ.jar
    Aug 23, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-mzFzbPzHjWw1Typ99slnon7mMQPI3kp3sDteotxW3Ik.jar
    Aug 23, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-iCotslpi2IyB8qk5c3MnKW5azIQopj1oUamPfE8bSnk.jar
    Aug 23, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-BCMzytn2wOcelqj3eYyS1AAFflF608BU3zAew9RW028.jar
    Aug 23, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-2SEZpQWZogwi4hGZUohR59f9RKdQNLHZtNzGh3Ql8dE.jar
    Aug 23, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-s2-r2GDgFLwy8_vgqQiUsp0fLHCJY2A_RiLXB1ZLus8.jar
    Aug 23, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 1 seconds
    Aug 23, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 23, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 23, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 23, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 23, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 23, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92128 bytes, hash b3531048591eef9f44a52b489674301b9fd81041e4c5a81800a066ceb9fcc7a4> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-s1MQSFke759EpStIlnQwG5_YEEHkxagYAKBmzrn8x6Q.pb
    Aug 23, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Aug 23, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-22_23_45_16-11508128442199715499?project=apache-beam-testing
    Aug 23, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-22_23_45_16-11508128442199715499
    Aug 23, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-22_23_45_16-11508128442199715499
    Aug 23, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-23T06:45:16.786Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 23, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-23T06:45:24.453Z: Worker configuration: n1-standard-1 in us-central1-f.
    Aug 23, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-23T06:45:25.137Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 23, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-23T06:45:25.307Z: Expanding GroupByKey operations into optimizable parts.
    Aug 23, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-23T06:45:25.333Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 23, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-23T06:45:25.387Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 23, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-23T06:45:25.423Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 23, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-23T06:45:25.460Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 23, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-23T06:45:25.486Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 23, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-23T06:45:25.814Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 23, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-23T06:45:25.886Z: Starting 5 workers in us-central1-f...
    Aug 23, 2020 6:45:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-23T06:45:52.521Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 23, 2020 6:45:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-23T06:45:53.953Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 23, 2020 6:46:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-23T06:46:13.497Z: Workers have started successfully.
    Aug 23, 2020 6:46:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-23T06:46:13.533Z: Workers have started successfully.
    Aug 23, 2020 6:46:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-23T06:46:46.139Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 23, 2020 6:46:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-23T06:46:46.282Z: Cleaning up.
    Aug 23, 2020 6:46:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-23T06:46:46.350Z: Stopping worker pool...
    Aug 23, 2020 6:47:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-23T06:47:41.513Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 23, 2020 6:47:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-23T06:47:41.558Z: Worker pool stopped.
    Aug 23, 2020 6:47:50 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-22_23_45_16-11508128442199715499 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): dd5f0487-2444-485b-92b4-8f2e510d2da7 and timestamp: 2020-08-23T06:47:50.359000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    13.909

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 23, 2020 6:47:50 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 2 mins 46.99 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 34s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/glt5u42oivrlo

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #904

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/904/display/redirect?page=changes>

Changes:

[noreply] Merge pull request #12653 from [BEAM-10758] Centralized Glob Translation


------------------------------------------
[...truncated 298.66 KB...]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 23, 2020 12:45:33 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 23, 2020 12:45:33 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 23, 2020 12:45:33 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 23, 2020 12:45:33 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 23, 2020 12:45:33 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 23, 2020 12:45:33 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 23, 2020 12:45:33 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 23, 2020 12:45:33 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 23, 2020 12:45:33 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 23, 2020 12:45:34 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 23, 2020 12:45:34 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 23, 2020 12:45:34 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 23, 2020 12:45:34 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 23, 2020 12:45:34 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 23, 2020 12:45:34 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 23, 2020 12:45:35 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 23, 2020 12:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 23, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-fbwWz2sq5-oy0qlJ7N0JoFbS3lddgSBFKCuiSFipaUI.jar
    Aug 23, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT--Bkno5egrKKxVc2BR8fqRLBsa_c3Ft4dTkXuvSiU9z0.jar
    Aug 23, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-PJp48Ar0psogwYk7H--vQCw4uq9yvFMNiJfoDZmVAW8.jar
    Aug 23, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-rpcxHOB0iCEXC-mI90qcki8HBc1RZvSVzYvxlmeOuFg.jar
    Aug 23, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-AJyBWF0wl9YnMO1Pu1hXWRZydOV6zStBUUnC3XbCC5I.jar
    Aug 23, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-rR_rx8aAI32y6RJUNFwKIvfNzvXQe-Pr1uJBzMPWY_w.jar
    Aug 23, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-523b8UVIHi1YZXD3sJf4SBACqsASlWQhlbPXiFi_VUI.jar
    Aug 23, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-jpgrNfVLPbaLBkq1HsaxMXv3QvD7uAyVCCV9SOno8qg.jar
    Aug 23, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-u5qWBbKRF3ZQd65womDlMgc-B817W3-avvNdZH-Ym2I.jar
    Aug 23, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-x41kqdVDlMaKJ4G7Dp9KOeXugMH75JzmLs9OybcqbyE.jar
    Aug 23, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-FLJKM74r89FUxvKTshlGILkbKCA8LOK3ODZj0Bj6WPg.jar
    Aug 23, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-9W5oTAXZ-4B6caV15Ot0g9G_KKuqQYNTXx5Cp6A9Mq0.jar
    Aug 23, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-jI8sNqPv1HSQTBVErkIDsts_Upq423BdBQ8d0pWLZts.jar
    Aug 23, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-Dl_HJL7tAf8H5_O_eQDOuxNt_nkOI136gGwGa6E7bPc.jar
    Aug 23, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-w5KHYtXhZuJoDCh3L-cWdRc6jEfQUzdl2eOCMcgQ8KY.jar
    Aug 23, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-ivqvWUeDEQI80CEaopx-rWP6ZzHBbH7phensPLILHB0.jar
    Aug 23, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test552003310022535400.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-RkokB7V7hFVJW33MPvgZdsD9JnBFRgUoBY1nRvU6xUg.jar
    Aug 23, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-50AS1nil1DO6FNrwHYfphIpfa--ceBJJJmGAMnmh47A.jar
    Aug 23, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-fbwWz2sq5-oy0qlJ7N0JoFbS3lddgSBFKCuiSFipaUI.jar
    Aug 23, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-j3AJQAmJYXz5hTBWsdtbXlHvL_jOfTPGdvO1kvf9jBk.jar
    Aug 23, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-6FKT5gEN7cQnfne0KXi5qx8rbIfv68KIDEHLf_h2Elc.jar
    Aug 23, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-88ypT6fILw-aDCA7bEH1Ehi06y0KUVZP0tnDSFu49pM.jar
    Aug 23, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-fSCPWg91XJDTM_Jq5Ack2Lb8K7wTkfePTtHF78AZ9Ro.jar
    Aug 23, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-KTsfEI1UVAlUeFuzsvEvSxciG1PtyaaO-R6Ys9-bKDM.jar
    Aug 23, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-Qy0UhnCGQ1qMUbnW6NLShprbu4nD26Ubsd-XLmdsHGo.jar
    Aug 23, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-gY7G1ftdP0MEVhX8IazkOlrYAT3S_0Z4H4ztY9uxZK8.jar
    Aug 23, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-bQLiWWPQ4YVRcz4v-15cUkZ3nKcuKxZ6tULmBulCyfM.jar
    Aug 23, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-PcFNOKBdjAsWi2ZAweCzAW9ZJWD53OD40q20r9cjlV4.jar
    Aug 23, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-Lqf85DOI1X9KnryAGEV4BJCFf2_hwTrI7UCfR5dIyNQ.jar
    Aug 23, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-HORn6Yied8lgCoScgSpR5EZd6kZEpPFa49RZ11K-67E.jar
    Aug 23, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-ibmShkjZVCTY4cUKmLJusiRbl4PWlG8CQr6aLlPFqZo.jar
    Aug 23, 2020 12:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 1 seconds
    Aug 23, 2020 12:45:38 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 23, 2020 12:45:38 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 23, 2020 12:45:38 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 23, 2020 12:45:38 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 23, 2020 12:45:38 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 23, 2020 12:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92127 bytes, hash c470469c4d808f678ba0d96f514bc6094f2498e5ede39ff1affbd2a12152b6f5> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-xHBGnE2Aj2eLoNlvUUvGCU8kmOXt45_xr_vSoSFStvU.pb
    Aug 23, 2020 12:45:39 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Aug 23, 2020 12:45:40 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-22_17_45_39-1011915456868889536?project=apache-beam-testing
    Aug 23, 2020 12:45:40 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-22_17_45_39-1011915456868889536
    Aug 23, 2020 12:45:40 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-22_17_45_39-1011915456868889536
    Aug 23, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-23T00:45:39.222Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 23, 2020 12:45:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-23T00:45:47.872Z: Worker configuration: n1-standard-1 in us-central1-f.
    Aug 23, 2020 12:45:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-23T00:45:48.799Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 23, 2020 12:45:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-23T00:45:48.853Z: Expanding GroupByKey operations into optimizable parts.
    Aug 23, 2020 12:45:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-23T00:45:48.885Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 23, 2020 12:45:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-23T00:45:48.955Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 23, 2020 12:45:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-23T00:45:48.994Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 23, 2020 12:45:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-23T00:45:49.026Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 23, 2020 12:45:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-23T00:45:49.060Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 23, 2020 12:45:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-23T00:45:49.475Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 23, 2020 12:45:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-23T00:45:49.571Z: Starting 5 workers in us-central1-f...
    Aug 23, 2020 12:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-23T00:45:57.814Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 23, 2020 12:46:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-23T00:46:19.380Z: Autoscaling: Raised the number of workers to 2 based on the rate of progress in the currently running stage(s).
    Aug 23, 2020 12:46:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-23T00:46:19.409Z: Resized worker pool to 2, though goal was 5.  This could be a quota issue.
    Aug 23, 2020 12:46:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-23T00:46:24.770Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 23, 2020 12:46:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-23T00:46:46.372Z: Workers have started successfully.
    Aug 23, 2020 12:46:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-23T00:46:46.409Z: Workers have started successfully.
    Aug 23, 2020 12:47:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-23T00:47:25.737Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 23, 2020 12:47:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-23T00:47:25.955Z: Cleaning up.
    Aug 23, 2020 12:47:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-23T00:47:26.075Z: Stopping worker pool...
    Aug 23, 2020 12:48:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-23T00:48:19.151Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 23, 2020 12:48:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-23T00:48:19.211Z: Worker pool stopped.
    Aug 23, 2020 12:48:26 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-22_17_45_39-1011915456868889536 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): f5b4f1c6-ea38-4c26-8dbc-b4a563481511 and timestamp: 2020-08-23T00:48:26.951000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    18.212

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 23, 2020 12:48:27 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
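
The read_time and fields_read values above were collected but not exported because the InfluxDB measurement/database settings were absent. A hedged sketch of how the Beam test utilities are usually pointed at an InfluxDB instance; the builder method names and every value below are assumptions to verify against the SDK version in use, not taken from this job:

    import org.apache.beam.sdk.testutils.publishing.InfluxDBSettings;

    public class InfluxConfigSketch {
      public static InfluxDBSettings settings() {
        // All names here are illustrative and the builder methods are assumed
        // from org.apache.beam.sdk.testutils.publishing; verify before use.
        return InfluxDBSettings.builder()
            .withHost("http://localhost:8086")      // assumed InfluxDB endpoint
            .withDatabase("beam_test_metrics")      // assumed database name
            .withMeasurement("sql_bqio_read_java")  // assumed measurement name
            .get();
      }
    }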

Gradle Test Executor 3 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 3 mins 1.909 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 9s
106 actionable tasks: 76 executed, 30 from cache

Publishing build scan...
https://gradle.com/s/j2sbtcy3jsdv4

Stopped 2 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #903

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/903/display/redirect?page=changes>

Changes:

[noreply] [BEAM-7587] Spark portable streaming (#12157)


------------------------------------------
[...truncated 294.72 KB...]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 22, 2020 6:45:36 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 22, 2020 6:45:36 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 22, 2020 6:45:36 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 22, 2020 6:45:36 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 22, 2020 6:45:36 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 22, 2020 6:45:36 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 22, 2020 6:45:37 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
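
The readUsingDefaultMethod failure is the coder-inference problem the exception itself describes: the RowMonitor ParDo emits Beam Row elements, and a PCollection of Rows needs an explicit schema (or coder) before the pipeline can be finalized. A minimal, hypothetical sketch of the remedy named in the message; the DoFn body and schema handling are illustrative, not the IT's actual code:

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaFix {
      public static PCollection<Row> monitor(PCollection<Row> input, Schema schema) {
        PCollection<Row> monitored =
            input.apply("RowMonitor", ParDo.of(new DoFn<Row, Row>() {
              @ProcessElement
              public void processElement(@Element Row row, OutputReceiver<Row> out) {
                out.output(row);  // stand-in for the IT's monitoring logic
              }
            }));
        // Without an explicit schema, coder inference fails exactly as in the
        // stack trace above; setRowSchema gives the output a concrete Row coder.
        return monitored.setRowSchema(schema);
      }
    }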

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 22, 2020 6:45:37 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 22, 2020 6:45:37 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 22, 2020 6:45:37 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 22, 2020 6:45:37 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 22, 2020 6:45:37 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 22, 2020 6:45:37 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 22, 2020 6:45:37 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 22, 2020 6:45:37 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
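
For the DIRECT_READ case the planner pushes both the projection (by, type, title, score) and the supported filter into the BigQuery Storage read, which is why only a Calc and the monitoring ParDos remain in the BEAMPlan. Roughly what that push-down amounts to at the IO level, sketched directly with BigQueryIO; the table reference is illustrative and this is not the IT's configuration:

    import com.google.api.services.bigquery.model.TableRow;
    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
    import org.apache.beam.sdk.values.PCollection;

    public class PushDownSketch {
      public static PCollection<TableRow> read(Pipeline p) {
        return p.apply(
            "Read HACKER_NEWS with push-down",
            BigQueryIO.readTableRows()
                .from("bigquery-public-data:hacker_news.full")  // illustrative table
                .withMethod(Method.DIRECT_READ)
                // Only the used fields are requested from the Storage API...
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                // ...and the supported predicate is evaluated server-side.
                .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));
      }
    }
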
    Aug 22, 2020 6:45:38 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 22, 2020 6:45:40 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 22, 2020 6:45:40 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-zXMa2YEs3eq_dSuiF7C277ud-u4SZ3Xylr8XGl6Xigo.jar
    Aug 22, 2020 6:45:40 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-mkr8dyXGpPuKHJTHzX0C3ToIffhds5NHP1obdb2WT8E.jar
    Aug 22, 2020 6:45:40 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-_8xGyG6FMpsDA4HYkEgC2DmWIFlwy-NSMVDE9F70Rkw.jar
    Aug 22, 2020 6:45:40 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-zXMa2YEs3eq_dSuiF7C277ud-u4SZ3Xylr8XGl6Xigo.jar
    Aug 22, 2020 6:45:40 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-sdT_mBrvjNkiXMk33jv07-DCriI64jl8j6LxH5A4Z8E.jar
    Aug 22, 2020 6:45:40 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-P_hIyerE8jC9UO5RXkQPm1idyr12keQTfq-ezLUng7I.jar
    Aug 22, 2020 6:45:40 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-C6aIxPq24I9Mej9Dud2GGW7OSy6OhKV5JV_zjbZooR8.jar
    Aug 22, 2020 6:45:40 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-0G4u5mhnbyzu6s9Yqgtrir_LizRoWRO95Qyye2UNdEM.jar
    Aug 22, 2020 6:45:40 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-QjctEmkiqr34c1iksNyZA1wkcj8ROnZ9B42WEltFna0.jar
    Aug 22, 2020 6:45:40 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test6210791092880861479.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-KjMlKU9O2dTu2HIGVmH-4C9DdwjU5i1q7RuQWhRw1PE.jar
    Aug 22, 2020 6:45:40 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-xVi3WbT_GP88K7_ndir2GhQOOUwSu7KVl2JnNjFtmJw.jar
    Aug 22, 2020 6:45:40 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-fewcJaQKa7vLHPy8oT8cMP-qpRSwBr1q9Jilxni74M4.jar
    Aug 22, 2020 6:45:40 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-IhCL_wzbr6lSAGtkIsp3rhWqGrhkD5d4smspXtl4cGk.jar
    Aug 22, 2020 6:45:40 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-ZgabKDZDkM2L0HXhUncxNBNVHTieky36vSVvI35a9rI.jar
    Aug 22, 2020 6:45:40 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-1wPkibDsvMd4DhPGwCgCTpbVdnkD5YD7_DexcNVZNUM.jar
    Aug 22, 2020 6:45:40 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-NN_HrBYDdJ2to5m_8xYCwiJAsO7YASNBS8RnhsHAK9Y.jar
    Aug 22, 2020 6:45:40 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-TmWKmJb73fq9xkKlJs0Ey2MQnk-Unm4KVDLz1gFgbkc.jar
    Aug 22, 2020 6:45:40 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-moTCDS24kPw8mkCZqSqRYgwNMIVrsjhLwzHKphZxVVI.jar
    Aug 22, 2020 6:45:40 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-Pt894WIRvyaN4Isof_jOKF4fUxHTu_UCDXfzGtOpazM.jar
    Aug 22, 2020 6:45:40 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-kRIuoEuJGnZ1TiNWwlEqK5u6TxFOI5uBT95LXYZI6Pk.jar
    Aug 22, 2020 6:45:40 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-KwoDBBDUKijYgebnepkZR4wN7IGhRprk9jYfikdTgmk.jar
    Aug 22, 2020 6:45:40 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-61L1VyJvmeSjsNnW9-JmKBp8l5SXyZmlGX1ShJ-ZVWM.jar
    Aug 22, 2020 6:45:40 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-5KbK9VmtfjZrU28Voi3K2TLIOESv-mvHLJgU62F2krE.jar
    Aug 22, 2020 6:45:40 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-2AtYKIAtAtvJJMon-YyrXNgbZpbTJn9HOVwLeWwh1Uw.jar
    Aug 22, 2020 6:45:40 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-LzOWstGNusFZgRPPp7C7IIDZILgFER42YsM_8ncCwaI.jar
    Aug 22, 2020 6:45:40 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-la8XrD5vBxomxx-LQNKBjdkpon3569Kmjw7mZ3T55DY.jar
    Aug 22, 2020 6:45:40 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-wGh7dpwMp8L9yB59qeXu1eMLjb0t55U_BlZMpE8vj-A.jar
    Aug 22, 2020 6:45:40 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-a2L_ZnYTeQLIp3umdqQQNewJ66paLIFF1UxCFXBoph4.jar
    Aug 22, 2020 6:45:40 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-QmwQxaP8yto3n8b-iG_1q5lFSIz_RSSsYJGCAZCPE8A.jar
    Aug 22, 2020 6:45:40 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-DP5e0j7G2TOWNSUMPzJlcq1hYDhbbkf98NPbjrUP7qo.jar
    Aug 22, 2020 6:45:40 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-in7tfi-C-hWsXHREXPxoWr8e_HT_db9bnzXxJ5NwAjY.jar
    Aug 22, 2020 6:45:41 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 1 seconds
    Aug 22, 2020 6:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 22, 2020 6:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 22, 2020 6:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 22, 2020 6:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 22, 2020 6:45:41 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 22, 2020 6:45:41 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92128 bytes, hash f58e5be23529d15bb9947cd34d7397a87c574b6cb626170f2901f4aaa0847543> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-9Y5b4jUp0Vu5lHzTTXOXqHxXS2y2JhcPKQH0qqCEdUM.pb
    Aug 22, 2020 6:45:41 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Aug 22, 2020 6:45:43 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-22_11_45_41-6683024959670157100?project=apache-beam-testing
    Aug 22, 2020 6:45:43 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-22_11_45_41-6683024959670157100
    Aug 22, 2020 6:45:43 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-22_11_45_41-6683024959670157100
    Aug 22, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-22T18:45:41.802Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 22, 2020 6:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-22T18:45:49.734Z: Worker configuration: n1-standard-1 in us-central1-f.
    Aug 22, 2020 6:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-22T18:45:50.438Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 22, 2020 6:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-22T18:45:50.482Z: Expanding GroupByKey operations into optimizable parts.
    Aug 22, 2020 6:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-22T18:45:50.511Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 22, 2020 6:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-22T18:45:50.588Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 22, 2020 6:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-22T18:45:50.623Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 22, 2020 6:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-22T18:45:50.650Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 22, 2020 6:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-22T18:45:50.687Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 22, 2020 6:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-22T18:45:51.093Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 22, 2020 6:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-22T18:45:51.173Z: Starting 5 workers in us-central1-f...
    Aug 22, 2020 6:46:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-22T18:46:13.194Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 22, 2020 6:46:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-22T18:46:18.485Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Aug 22, 2020 6:46:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-22T18:46:18.518Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Aug 22, 2020 6:46:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-22T18:46:23.820Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 22, 2020 6:46:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-22T18:46:38.702Z: Workers have started successfully.
    Aug 22, 2020 6:46:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-22T18:46:38.741Z: Workers have started successfully.
    Aug 22, 2020 6:47:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-22T18:47:12.331Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 22, 2020 6:47:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-22T18:47:12.454Z: Cleaning up.
    Aug 22, 2020 6:47:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-22T18:47:12.565Z: Stopping worker pool...
    Aug 22, 2020 6:48:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-22T18:48:02.426Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 22, 2020 6:48:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-22T18:48:02.478Z: Worker pool stopped.
    Aug 22, 2020 6:48:11 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-22_11_45_41-6683024959670157100 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): ea05adbe-a67d-46f9-98fd-b7d8a4737b5c and timestamp: 2020-08-22T18:48:11.142000000Z:
                     Metric:                    Value:
                   read_time                    12.826
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 22, 2020 6:48:11 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.019 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 8,5,main]) completed. Took 2 mins 43.006 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 54s
106 actionable tasks: 72 executed, 34 from cache

Publishing build scan...
https://gradle.com/s/i2vlgvdboztms

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #902

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/902/display/redirect>

Changes:


------------------------------------------
[...truncated 293.76 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 22, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 22, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 22, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 22, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 22, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 22, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 22, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
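
This is the same coder-inference failure seen in the #903 log above; the exception's other suggested remedy is to set the coder explicitly. A small illustrative sketch of that alternative, with a hypothetical schema matching the projected columns:

    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class ExplicitRowCoderFix {
      public static PCollection<Row> withCoder(PCollection<Row> rows) {
        // A schema describing what the ParDo emits; field names and types are illustrative.
        Schema schema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();
        // Comparable to setRowSchema(schema): the PCollection gets a concrete coder,
        // so inference no longer needs to guess one for Row.
        return rows.setCoder(RowCoder.of(schema));
      }
    }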

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 22, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 22, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 22, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 22, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 22, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 22, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 22, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 22, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 22, 2020 12:45:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 22, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 22, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-0E6qWSCvwqIxNtG_Pd6wcevlQ8E6VLcC8tLoANhEaVc.jar
    Aug 22, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-0E6qWSCvwqIxNtG_Pd6wcevlQ8E6VLcC8tLoANhEaVc.jar
    Aug 22, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-xtWcOCw0FqEEJriWx-oTxp40dl84fnaC4x-Ps6Fkufs.jar
    Aug 22, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-FWkswNUWDVMjKTicYbGqUtj7SfyfooAvG-Qtp2qihAA.jar
    Aug 22, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-iQUU8I_sPkSU3MkJPhPnsgKafU9NgLj4sz4uTZH_D_U.jar
    Aug 22, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-KahvpuU0UKQIRjDfXu4TR4hiOEj6jHyWW9HqE0onDK8.jar
    Aug 22, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-ZLpKQTOxiY8PuvFOCcjvfiWmuRsImoIWeewQ1GKQdC8.jar
    Aug 22, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-zbg6slGKlev7J0Gs8zM-bTwD-cc_gHDByHw_hvfPBkI.jar
    Aug 22, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4671933870627386278.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-A1nD4w87joi5d2ECcUIhd0E118HeHu4wG9bE42HTuQk.jar
    Aug 22, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-kbcCMisW8RH2LsTMg5IhJDGye-57L2sQIyD-j5tikns.jar
    Aug 22, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-b75qzaCEviFwykIMMkyGc2g5n9CxfYGd7e13X19T0Zo.jar
    Aug 22, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-Dz1VdQ8oVEsg9F3eQ_wyK5z6CYf2FQ6YiizKV-zmTY8.jar
    Aug 22, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-SMxm_BU2Utjj-g9l1nvRcIwKhW3-qu0e-UkBb97M6KI.jar
    Aug 22, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-f9oNmSDQgOjTONamuQgQ33kILBE-oVJS5lS0zdS1Vhk.jar
    Aug 22, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-rOFt_nTE__To5lMEeHX0yhnyZNVKPsbsiWn2pByvYUI.jar
    Aug 22, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-M3ufUGnV5loZaj0KiCD4JWmkD5LQcYHtJeVHuxWuWlc.jar
    Aug 22, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-ASPuM_5XD_ViRJqK1CHyLHsbepsmNfyw4Ul5vPI9i9I.jar
    Aug 22, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-mrjo8Umwq80qJmmI-foxkbbvujBfLI0F11BHxScmI5k.jar
    Aug 22, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-TQR1wot9GlOivoELTLnvU2pL0gHwQgxgkMfju2c10L8.jar
    Aug 22, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-z5b3e7YzaUuj7xqwSdw_n29mVr4q33oIE42zJxNBcAk.jar
    Aug 22, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-CfvxtMNzaMCX5TyR5cpwON_mJVqHgjVwIdtPKN_IKPo.jar
    Aug 22, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-sAse2c1neQXPJsWaddbZf845LDO3XVVj7sw6YrL3q_c.jar
    Aug 22, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-gjuHplkAun2dfuf5Sr5K9GThEj4esGPCQqhGZV9GZpM.jar
    Aug 22, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-7D90pHM3PikiMCdgJvv2-uKXKc_BR34kK1ejI0A3HYo.jar
    Aug 22, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-EBve42dEWuwh6Qz86T6570q-tmT5GeMKclr-84BX-FI.jar
    Aug 22, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-nAYOsOGQk4GN6n4I0RUZJ7UHy779RL88namHkiE7w6g.jar
    Aug 22, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-eh6ddPWEJIs8PlBiy_8sTUztJ-41WrriOMpAneOYRWk.jar
    Aug 22, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-nrG-Epzwz24G-chsZlnkA5ByIN1qRIuvNmxVoBMuRcw.jar
    Aug 22, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-q6XyMX5i1jJvaIO4rSpAp-yyST8p5bjCUSzM4f6LrAo.jar
    Aug 22, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-hR28HrehwoCIk_JbZ3GHtEy_IeDegGg6UPDonpgsFOs.jar
    Aug 22, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-dbAYgxvHR3fk3KD9Ad6-ZuKblbM3JouCnaEAUF0iQ8Y.jar
    Aug 22, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 1 seconds
    Aug 22, 2020 12:45:17 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 22, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 22, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 22, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 22, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 22, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92128 bytes, hash ecb20210dfcde205fe373d81c1b56c0450543a65f38725e0b6a5753080c2e366> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-7LICEN_N4gX-Nz2BwbVsBFBUOmXzhyXgtqV1MIDC42Y.pb
    Aug 22, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Aug 22, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-22_05_45_18-3309960268890886185?project=apache-beam-testing
    Aug 22, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-22_05_45_18-3309960268890886185
    Aug 22, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-22_05_45_18-3309960268890886185
    Aug 22, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-22T12:45:18.499Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 22, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-22T12:45:25.844Z: Worker configuration: n1-standard-1 in us-central1-f.
    Aug 22, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-22T12:45:26.632Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 22, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-22T12:45:26.666Z: Expanding GroupByKey operations into optimizable parts.
    Aug 22, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-22T12:45:26.706Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 22, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-22T12:45:26.787Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 22, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-22T12:45:26.819Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 22, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-22T12:45:26.853Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 22, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-22T12:45:26.890Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 22, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-22T12:45:27.534Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 22, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-22T12:45:27.613Z: Starting 5 workers in us-central1-f...
    Aug 22, 2020 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-22T12:45:47.641Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 22, 2020 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-22T12:45:55.735Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 22, 2020 12:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-22T12:46:27.349Z: Workers have started successfully.
    Aug 22, 2020 12:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-22T12:46:27.379Z: Workers have started successfully.
    Aug 22, 2020 12:47:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-22T12:47:10.396Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 22, 2020 12:47:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-22T12:47:10.559Z: Cleaning up.
    Aug 22, 2020 12:47:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-22T12:47:10.656Z: Stopping worker pool...
    Aug 22, 2020 12:47:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-22T12:47:52.995Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 22, 2020 12:47:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-22T12:47:53.058Z: Worker pool stopped.
    Aug 22, 2020 12:48:02 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-22_05_45_18-3309960268890886185 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 86932def-0272-4e62-8450-6efb831b40c9 and timestamp: 2020-08-22T12:48:02.175000000Z:
                     Metric:                    Value:
                   read_time                     19.21
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 22, 2020 12:48:02 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.019 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':',5,main]) completed. Took 2 mins 56.612 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 46s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/qwbjoyge5l3ti

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #901

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/901/display/redirect?page=changes>

Changes:

[noreply] [BEAM-9918] Adding tests and documentation to xlang components (#12667)


------------------------------------------
[...truncated 295.53 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 22, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 22, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 22, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 22, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 22, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 22, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 22, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
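
    The root cause above is Beam's standard coder-inference failure for a PCollection of Row: a Row has no default
    coder, so the transform that produces it must attach a schema (or an explicit coder). Below is a minimal,
    self-contained sketch of the fix the message itself suggests; the class name, schema, field names, and toy
    pipeline are illustrative assumptions only, not the IT's actual code.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        // Illustrative schema; the field names are assumptions, not the HACKER_NEWS table's real layout.
        final Schema schema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();

        PCollection<Row> rows =
            p.apply(Create.of(1L, 2L, 3L))
                .apply(
                    "ToRows",
                    ParDo.of(
                        new DoFn<Long, Row>() {
                          @ProcessElement
                          public void processElement(ProcessContext c) {
                            c.output(
                                Row.withSchema(schema)
                                    .addValues("someone", "story", "post " + c.element(), c.element())
                                    .build());
                          }
                        }))
                // Without this call, coder inference fails exactly as in the log above.
                // rows.setCoder(RowCoder.of(schema)) would be an equivalent alternative.
                .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }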

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 22, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 22, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 22, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 22, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 22, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 22, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 22, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 22, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 22, 2020 6:45:33 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 22, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 22, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-aLxeOtMlWbs5L35tMHLesWKnJJu1VFZMO9dm7V8E6ig.jar
    Aug 22, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-G6VaQTqrAbCaGDi7HIyBNe2U2lJOMcNlIT7V-HmQEWQ.jar
    Aug 22, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7806331387529705485.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-P974dmP9b4tQtQGgGnlhJaH4iIgSyPHqXfsCPO-EbaU.jar
    Aug 22, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-_wTcfil0spQWa68xh4cGjO7PIlnQm1hkY39gfEVy0dc.jar
    Aug 22, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-Af5QPq0xTAhS5J9IvBQBauxNE1SOH8HWV55ArZT7miE.jar
    Aug 22, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-c5rc4k95pbdRdWSDFfjqt6w06XsSCirDJMaAorKgmec.jar
    Aug 22, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-aLxeOtMlWbs5L35tMHLesWKnJJu1VFZMO9dm7V8E6ig.jar
    Aug 22, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-DKSuJHQhT_vVvFbaFF3k5HKDcSoxnY21djOhj2IC0s8.jar
    Aug 22, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-XyM8Gg_Fa3jUAkQ6U3lXZ_tXZCKOJQmAzUq-rfVNjjg.jar
    Aug 22, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-1VvXyFK5PL1aJtQ6viyyNngDD31sUXhQumqkYZR8jow.jar
    Aug 22, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-b4zoz8Ogy-XEPsdGmaby6xi8XeSCXTIeuKdMTiIgyF4.jar
    Aug 22, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-PECtuulq-9ZFkPEpqqVWAcG8fmH6MzPJwvKkHf85izQ.jar
    Aug 22, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests--4FsTpwgluGwV6Kbn-iu05jxvJ4qVliMV57y3uvx--0.jar
    Aug 22, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-h78xKBb6favRQMo1-DNoTaABFKL--uBpHi9ET0KZ-6E.jar
    Aug 22, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-Bic6UQB-OWPJgwlQz1-7-02qzGOgMv1Kyy2OEJIXIhk.jar
    Aug 22, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-3jZ-Ofi8oeNN0TQR4a6I5NT8k4HT-M7z40MwexonppY.jar
    Aug 22, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-_FXeN_UvI49MUsWYyErB8CbvhsYrCXUTDoCmhhYn5GU.jar
    Aug 22, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-manPpKHhjxalcxYj5Gwp5lABOipPi2E55YXcr2wj4UQ.jar
    Aug 22, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-8jz3bxqmaI1-qP0zHD0SripwPGK4yQmZF7F12IbpZvQ.jar
    Aug 22, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-rWWcDEDWt13ocstJ9Jf14QI-FJop60hlz7_qn4wul_Y.jar
    Aug 22, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-zEpJ4uJzMWSCCJva3AQ1LMrlMh6wr4RaI1KOm9i9yZg.jar
    Aug 22, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-d3lg9ruL9jv_UkJyspotpaSWSk_6y28Xz7WQCZTa9cM.jar
    Aug 22, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-1lo8ZSl8k70W4V9wX2X1_f61gfJN2tWLaGICdU1EPL4.jar
    Aug 22, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-00G7sDl3P7CC1kMnnMzVtETZIqXPAgECAxHDVASlswo.jar
    Aug 22, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-7Xtxa_4YoByqCA8zVVXp88Cfz8uVDlDFfWEXNREldNE.jar
    Aug 22, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-EGDOFNCvNuDVg_LG_XKGxD42IgyN5RTHYDchKkblNDY.jar
    Aug 22, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-Kdl1MlhJI_7fV_2chQLOUy4ATyUsrJ3jstyCGZX4hy0.jar
    Aug 22, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-vZOuJTSJjZqssXv0HOaCgw90o5EmJOTbuJUd1A770UY.jar
    Aug 22, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-NTHPXXB7SgKAzpabtJDGOxSS3JPh54AG5boaCjGJwrU.jar
    Aug 22, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-P5utFFU_z0kfjpbjzV3DVdqavGi_IZUOc1Y10TsJ8to.jar
    Aug 22, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-dy9PiJde15MzAxOw_Ln5uMTy-S3ZkMukliL5HXjdPOE.jar
    Aug 22, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 1 seconds
    Aug 22, 2020 6:45:36 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 22, 2020 6:45:36 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 22, 2020 6:45:36 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 22, 2020 6:45:36 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 22, 2020 6:45:36 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 22, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92128 bytes, hash 98bfdc8a47320705d1fac304d02376b7c0a81bf84f28845119bde7fc06711107> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-mL_cikcyBwXR-sME0CN2t8CoG_hPKIRRGb3n_AZxEQc.pb
    Aug 22, 2020 6:45:37 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Aug 22, 2020 6:45:38 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-21_23_45_37-1086136998604518568?project=apache-beam-testing
    Aug 22, 2020 6:45:38 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-21_23_45_37-1086136998604518568
    Aug 22, 2020 6:45:38 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-21_23_45_37-1086136998604518568
    Aug 22, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-22T06:45:37.396Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 22, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-22T06:45:45.049Z: Worker configuration: n1-standard-1 in us-central1-f.
    Aug 22, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-22T06:45:45.795Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 22, 2020 6:45:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-22T06:45:45.845Z: Expanding GroupByKey operations into optimizable parts.
    Aug 22, 2020 6:45:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-22T06:45:45.884Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 22, 2020 6:45:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-22T06:45:45.953Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 22, 2020 6:45:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-22T06:45:45.993Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 22, 2020 6:45:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-22T06:45:46.032Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 22, 2020 6:45:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-22T06:45:46.064Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 22, 2020 6:45:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-22T06:45:46.783Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 22, 2020 6:45:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-22T06:45:46.867Z: Starting 5 workers in us-central1-f...
    Aug 22, 2020 6:46:07 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-22T06:46:05.854Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 22, 2020 6:46:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-22T06:46:38.427Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 22, 2020 6:46:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-22T06:46:48.961Z: Workers have started successfully.
    Aug 22, 2020 6:46:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-22T06:46:48.988Z: Workers have started successfully.
    Aug 22, 2020 6:47:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-22T06:47:20.820Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 22, 2020 6:47:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-22T06:47:21.123Z: Cleaning up.
    Aug 22, 2020 6:47:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-22T06:47:21.211Z: Stopping worker pool...
    Aug 22, 2020 6:48:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-22T06:48:13.252Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 22, 2020 6:48:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-22T06:48:13.360Z: Worker pool stopped.
    Aug 22, 2020 6:48:21 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-21_23_45_37-1086136998604518568 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 700accdf-f4aa-43d0-9839-2caad2d7c1c9 and timestamp: 2020-08-22T06:48:21.179000000Z:
                     Metric:                    Value:
                   read_time                    11.687
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 22, 2020 6:48:21 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 2 mins 57.096 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 1s
106 actionable tasks: 74 executed, 32 from cache

Publishing build scan...
https://gradle.com/s/i7het2hf6etf6

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #900

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/900/display/redirect?page=changes>

Changes:

[Robin Qiu] Fix bug in JoinScanWithRefConverter

[ningk] [BEAM-10771] Added Whitespacelint job to pull request template

[noreply] Use setCoder, not setSchema (#12662)

[noreply] [BEAM-10703] Prepare Dataflow Java runner for shardable states (#12578)

[noreply] [BEAM-10549] Improve runtime type checking performance for the Python

[Kyle Weaver] [BEAM-10460] Remove SparkPortableExecutionTest altogether.

[noreply] [BEAM-10788] Also disable immutability checking for transforms within

[noreply] [BEAM-5715, BEAM-8862, BEAM-8702] Removes all references to grpc_all and

[noreply] [BEAM-10761] Prefer TestPubsub#assertSubscriptionEventuallyExists when

[noreply] [BEAM-10654] Implemented ExternalSchemaIOTransformRegistrar for jdbc

[noreply] Merge pull request #12580 from [BEAM-2855] nexmark python suite


------------------------------------------
[...truncated 301.39 KB...]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 22, 2020 12:46:00 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 22, 2020 12:46:00 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 22, 2020 12:46:00 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 22, 2020 12:46:00 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 22, 2020 12:46:00 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 22, 2020 12:46:00 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 22, 2020 12:46:00 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 22, 2020 12:46:01 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 22, 2020 12:46:01 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 22, 2020 12:46:01 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 22, 2020 12:46:01 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 22, 2020 12:46:01 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 22, 2020 12:46:01 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 22, 2020 12:46:01 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 22, 2020 12:46:01 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 22, 2020 12:46:02 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 22, 2020 12:46:04 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 22, 2020 12:46:04 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-nhk9AG8o82Vrbrx9rJYARJIyWDYovqg7wQAGAny9HkI.jar
    Aug 22, 2020 12:46:04 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-dgdjU83MjAoVbyQbCiUzVKQpwO5thM9yRZUBMg_J2LU.jar
    Aug 22, 2020 12:46:04 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-PImQZYvx3TwDkUZTWVTap6zTXsUfgZgUZhe_WZz_Ne8.jar
    Aug 22, 2020 12:46:04 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-7dk0K6GHQ1uZ6zzfbPjEC5ixo1FIbsUXx1pC_ZCXqVs.jar
    Aug 22, 2020 12:46:04 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-mEthrbkQ0zzuiIpUmhtag-Gm0YGjT7ZX6dP62LIfN6g.jar
    Aug 22, 2020 12:46:04 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-yyb3e4PIYHQgycWOoS_Yb-jLVSYgRON-wXyJ-rzcWMA.jar
    Aug 22, 2020 12:46:04 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-0xGm-x4hY2fhtNsRPRKaFfD35k3em2isvyNL0WsV9EI.jar
    Aug 22, 2020 12:46:04 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-1Uz9Qdy9_0J8uMacZ-NpvRxZSvyZv-lK90VSehqTy_Y.jar
    Aug 22, 2020 12:46:04 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-sdCn4BLeY8Yu4SBKhhpBLE2ITzVCGQGz17uhndOIxIo.jar
    Aug 22, 2020 12:46:04 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-EWJTi6XsR7uqZ6BVuv7r9iYyqLujQyEwQJ7LY1EVzyo.jar
    Aug 22, 2020 12:46:04 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT--RwwWtRiBsNOc6rw9w5IT8VVPrSF-I1SXH5gVxumOf8.jar
    Aug 22, 2020 12:46:04 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-HYb1foT6HCDycuHWsvKTvU4Y-o1EK2wdN5DgT7LOXIQ.jar
    Aug 22, 2020 12:46:04 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-94Qe5eiO21TxhVgWVnWpiSf4E9kDAq1nfW_y7kVy2q0.jar
    Aug 22, 2020 12:46:04 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-c3jbWy3kGpJgl5fNziaWLv_dPb5Pw6t8IUfuOH47TWY.jar
    Aug 22, 2020 12:46:04 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-fB9o4AbJSoEgDFY3j6gIMW2rSfTObjGdrfRgpupWt-s.jar
    Aug 22, 2020 12:46:04 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-D-JN8Gp6CZjpzxmh5VQTpF-aSyFwg-Cu_QyNB9x9Z4k.jar
    Aug 22, 2020 12:46:04 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-6_KbujN-OTK7YQlLMWd_QcGZBkVK3Vz4jZdEV0kXDbc.jar
    Aug 22, 2020 12:46:04 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-mvMshnHiuIFI-NJgqiUtSxnqhVwa9h82VyViX7CAbps.jar
    Aug 22, 2020 12:46:04 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-fNQ9Kuw9sGHQXI8JGHRlIBZCvIdz10nrbth9fPkGEFw.jar
    Aug 22, 2020 12:46:04 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-qnO8v7DO7gEO0VthXRNcog9ayHqqvXLYaYY2SxKP8EE.jar
    Aug 22, 2020 12:46:04 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-yZrmFmQ2RhRbvZTXj1PXKGlWpdshvQQchxcF85KhXX0.jar
    Aug 22, 2020 12:46:04 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-jNZ-tVa9VnkjC-kqx2us7-oXdEcF1eVpHIFpDLbYBk8.jar
    Aug 22, 2020 12:46:04 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-h3vqCTzGBJ20Mgkw-jkGZ9NQ6RrP3ET3RzmBs-xsiMI.jar
    Aug 22, 2020 12:46:04 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT--RwwWtRiBsNOc6rw9w5IT8VVPrSF-I1SXH5gVxumOf8.jar
    Aug 22, 2020 12:46:05 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-vIQcTxmF9_0oSwqIEDve-2KiRFqLn4sX3kU6O2zztP0.jar
    Aug 22, 2020 12:46:05 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test1176980637154092454.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-KJnIDW-fYBCG7lG9spkqQEGVhB6y8CInlRCWXxq8JKc.jar
    Aug 22, 2020 12:46:05 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-6ayLuEd0CNlF3OOwb34SQfgWeZYKdVeeFG6xTidsT3w.jar
    Aug 22, 2020 12:46:05 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-Qp5afpj9YTwLyboyn2OrvUJ-agDaeYbh1V2OgDrvuIM.jar
    Aug 22, 2020 12:46:05 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-ygI0OpEizPHrqlw27k2hBbUq-RmMMtz_3hMLEUBhGBk.jar
    Aug 22, 2020 12:46:05 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-_vGIsfU5Aq2XvCKPSmi5DsqGMSsFEH4TneMWUwpVhiU.jar
    Aug 22, 2020 12:46:05 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-0zA5VqMAmk6QQ-9aAqRojUze9XqQ-pGCzA9ezal9AHk.jar
    Aug 22, 2020 12:46:05 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 1 seconds
    Aug 22, 2020 12:46:05 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 22, 2020 12:46:06 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 22, 2020 12:46:06 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 22, 2020 12:46:06 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 22, 2020 12:46:06 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 22, 2020 12:46:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92128 bytes, hash 276448ee2eab52e634f84b84cf0207c1c49843dcf2e58d20bf039136cded9066> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-J2RI7i6rUuY0-EuEzwIHwcSYQ9zy5Y0gvwORNs3tkGY.pb
    Aug 22, 2020 12:46:06 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Aug 22, 2020 12:46:07 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-21_17_46_06-9482108558882284215?project=apache-beam-testing
    Aug 22, 2020 12:46:07 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-21_17_46_06-9482108558882284215
    Aug 22, 2020 12:46:07 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-21_17_46_06-9482108558882284215
    Aug 22, 2020 12:46:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-22T00:46:06.637Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 22, 2020 12:46:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-22T00:46:14.336Z: Worker configuration: n1-standard-1 in us-central1-f.
    Aug 22, 2020 12:46:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-22T00:46:15.123Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 22, 2020 12:46:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-22T00:46:15.167Z: Expanding GroupByKey operations into optimizable parts.
    Aug 22, 2020 12:46:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-22T00:46:15.203Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 22, 2020 12:46:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-22T00:46:15.282Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 22, 2020 12:46:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-22T00:46:15.308Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 22, 2020 12:46:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-22T00:46:15.336Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 22, 2020 12:46:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-22T00:46:15.358Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 22, 2020 12:46:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-22T00:46:15.840Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 22, 2020 12:46:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-22T00:46:15.922Z: Starting 5 workers in us-central1-f...
    Aug 22, 2020 12:46:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-22T00:46:44.593Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Aug 22, 2020 12:46:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-22T00:46:44.623Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Aug 22, 2020 12:46:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-22T00:46:48.414Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
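
The warning above can be cleared by pruning unused custom metric descriptors from the apache-beam-testing project. A rough sketch of one way to do that with the google-cloud-monitoring Java client (a substitute for the APIs Explorer links in the warning, not anything the test job does) is below; the type prefix used to pick out Dataflow-created descriptors is an assumption and should be verified against the project's actual descriptors before deleting anything:

    import com.google.api.MetricDescriptor;
    import com.google.cloud.monitoring.v3.MetricServiceClient;
    import com.google.monitoring.v3.ListMetricDescriptorsRequest;
    import com.google.monitoring.v3.ProjectName;

    public class PruneMetricDescriptors {
      public static void main(String[] args) throws Exception {
        try (MetricServiceClient client = MetricServiceClient.create()) {
          ListMetricDescriptorsRequest request =
              ListMetricDescriptorsRequest.newBuilder()
                  .setName(ProjectName.of("apache-beam-testing").toString())
                  .build();
          for (MetricDescriptor descriptor : client.listMetricDescriptors(request).iterateAll()) {
            // Assumption: Dataflow-created custom metrics live under this prefix;
            // check the listed types before actually deleting.
            if (descriptor.getType().startsWith("custom.googleapis.com/dataflow/")) {
              client.deleteMetricDescriptor(descriptor.getName());
            }
          }
        }
      }
    }
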
    Aug 22, 2020 12:46:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-22T00:46:49.905Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 22, 2020 12:47:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-22T00:47:11.241Z: Workers have started successfully.
    Aug 22, 2020 12:47:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-22T00:47:11.381Z: Workers have started successfully.
    Aug 22, 2020 12:47:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-22T00:47:48.637Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 22, 2020 12:47:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-22T00:47:48.812Z: Cleaning up.
    Aug 22, 2020 12:47:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-22T00:47:48.888Z: Stopping worker pool...
    Aug 22, 2020 12:48:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-22T00:48:39.189Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 22, 2020 12:48:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-22T00:48:39.233Z: Worker pool stopped.
    Aug 22, 2020 12:48:46 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-21_17_46_06-9482108558882284215 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 226efe94-5260-42cb-9c32-4e21f5de5b44 and timestamp: 2020-08-22T00:48:46.701000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    13.666

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 22, 2020 12:48:47 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 4 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.019 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.025 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 2 mins 55.063 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 28s
106 actionable tasks: 79 executed, 27 from cache

Publishing build scan...
https://gradle.com/s/k4szqszlcexoa

Stopped 3 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #899

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/899/display/redirect?page=changes>

Changes:

[Udi Meiri] Color Jenkins logs support

[ningk] Added a whitespace lint precommit job.

[ningk] Removed trailing whitespaces from all markdown and gradle build files.

[ningk] Updated jenkins precommit job README, added comments of what needs to be

[ningk] Removed trailing whitespaces from HEAD changes

[Alan Myrvold] [BEAM-10049] Add licenses for go dependencies in python, java, and go

[noreply] Propagate BigQuery streaming insert throttled time to Dataflow worker in

[tysonjh] [BEAM-10751] Revert "Extending archiveJunit post-commit task with

[noreply] [BEAM-10764] Make is_in_ipython robust (#12641)

[noreply] [BEAM-10766] Fix flake where ByteString coercion could modify cached


------------------------------------------
[...truncated 296.07 KB...]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 21, 2020 6:45:36 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 21, 2020 6:45:36 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 21, 2020 6:45:36 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 21, 2020 6:45:36 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 21, 2020 6:45:36 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 21, 2020 6:45:36 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 21, 2020 6:45:36 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
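
The failure above is the missing-Row-coder problem: the output of ParDo(RowMonitor) is a PCollection<Row>, Beam cannot infer a coder for Row, and so the schema must be attached explicitly, exactly as the error message suggests. A minimal, self-contained sketch of that fix is below; the schema fields and the pass-through DoFn are placeholders for illustration, not the IT's actual code:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaFixSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Placeholder schema; the real HACKER_NEWS row has more columns.
        Schema schema =
            Schema.builder().addStringField("author").addInt64Field("score").build();

        p.apply(Create.of(Row.withSchema(schema).addValues("someone", 3L).build())
                .withCoder(RowCoder.of(schema)))
            // Stand-in for ParDo(RowMonitor): it outputs Row, for which no coder
            // can be inferred automatically.
            .apply("RowMonitor", ParDo.of(new DoFn<Row, Row>() {
              @ProcessElement
              public void processElement(@Element Row row, OutputReceiver<Row> out) {
                out.output(row);
              }
            }))
            // The fix the error message points at: attach the schema so coder
            // inference succeeds (equivalently, .setCoder(RowCoder.of(schema))).
            .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }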

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 21, 2020 6:45:37 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 21, 2020 6:45:37 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 21, 2020 6:45:37 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 21, 2020 6:45:37 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 21, 2020 6:45:37 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 21, 2020 6:45:37 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 21, 2020 6:45:37 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 21, 2020 6:45:37 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
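
For context on what the pushed-down plan above amounts to at the IO layer: with DIRECT_READ, the used fields become the Storage API's selected fields and the supported predicate becomes a row restriction. A minimal sketch using plain BigQueryIO (not the SQL test's actual code, and with a hypothetical table reference) would look like:

    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        p.apply(
            BigQueryIO.readTableRows()
                .from("apache-beam-testing:beam.HACKER_NEWS") // hypothetical table reference
                .withMethod(TypedRead.Method.DIRECT_READ)
                // Project push-down: only the fields the query uses are requested.
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                // Filter push-down: the supported predicate becomes a row restriction.
                .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }
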
    Aug 21, 2020 6:45:38 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 21, 2020 6:45:40 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 21, 2020 6:45:40 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-cqUJKWPIHEcn5P7-AbGiTEG4sMPv8blN-Jj2pHWtkOQ.jar
    Aug 21, 2020 6:45:40 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-7YwCi5fP7EZdRdFP5VmGpGak-zzC7CIW2Eg3iiUCAIw.jar
    Aug 21, 2020 6:45:40 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-JQupqQDqqlx-mUBhVkOvjdqGTGy4SNgxWqt8IdORjXM.jar
    Aug 21, 2020 6:45:40 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-zwbieI0lxHEnbRoswT4o4bsoqIPIKdCoI0TNeW9Ci-0.jar
    Aug 21, 2020 6:45:40 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-ck33hwgVu9vE63stMmULLnOTBBqsg9pz01BW5AEw2xY.jar
    Aug 21, 2020 6:45:40 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-oMMkryVrgIYrnXtesilTEZa-k5i78XU6rNP9LPaLHT4.jar
    Aug 21, 2020 6:45:40 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-gKu2m3Vyqjibd-UHFHdLZvPfjMK97jxYeYmtFlKaiVA.jar
    Aug 21, 2020 6:45:40 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-blMX7hGih-168H7_1YNXE1xt4D3F7Kan60AyIeuBxe4.jar
    Aug 21, 2020 6:45:40 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-HiQ8dVKfU1GodWY3MaszzTfNIZvHRbO27GXP7O_3Udc.jar
    Aug 21, 2020 6:45:40 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-RKFSy4PxzWagkF-_GKFirWUaZZHE4AgxGG9VQKvNYvc.jar
    Aug 21, 2020 6:45:40 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-nslVkoqbLRLGvmo4QZcyv6_eJEJMreuTZd_hK7dJ_Fw.jar
    Aug 21, 2020 6:45:40 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-mOSPSYGsJoz0kvpay-eaD8gahDKXCy0LZ9uRmSDbUdg.jar
    Aug 21, 2020 6:45:40 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-J3XoWVbJC_Yk7DGn2ftie_dpKyPF6G9AZvos1DvD9j4.jar
    Aug 21, 2020 6:45:40 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-9qRZquTyHAP454KFOKDrmwaZr6pBzMWbkNteXRETQig.jar
    Aug 21, 2020 6:45:40 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-y4zUfdFqXpBLxJ6ooMgIPjJUm3IiQVMqVHJWrsUr-xY.jar
    Aug 21, 2020 6:45:40 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-6MnZz4b6wQWFgtmlhvz56SsGtsYYLpOSkWFsNEloW0M.jar
    Aug 21, 2020 6:45:40 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-QyJxjCM5ZBkyw-umXMeo-6R4tdXz9dNMn3q0FYREv6k.jar
    Aug 21, 2020 6:45:40 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-m8NSZepGJt1Et7YKLttOFyb400W-l2celaX2KXHah4U.jar
    Aug 21, 2020 6:45:40 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-wk215zkROeInklES8EPsptCJY95AZFRjW7G6mfZ5xsk.jar
    Aug 21, 2020 6:45:40 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4420437358518849286.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-CbkC1u1JkQsu4aA6IiRgpxyAQ78LgwzhOpcs11k1ZLw.jar
    Aug 21, 2020 6:45:40 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-FKt4T1fGKnmNhmtp_bPagdRRBjHC3XbCe6P18eLx9hI.jar
    Aug 21, 2020 6:45:40 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-cqUJKWPIHEcn5P7-AbGiTEG4sMPv8blN-Jj2pHWtkOQ.jar
    Aug 21, 2020 6:45:40 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-uTm8yM53apRG30mxha45TdKY5whrlP52IC0U1Pa4sGA.jar
    Aug 21, 2020 6:45:40 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-7cjyrB90jUub25NnWFeEdu4gDQ-4XlEculRWjVMOsYk.jar
    Aug 21, 2020 6:45:40 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-HTIVXvQ6T_Gy0DSpXbwuEDJTflIWJZ6l_pFPLLKbGig.jar
    Aug 21, 2020 6:45:40 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-aGxB9DFfGiY2r2qNrzwTgra9gi5HFGJwEm0sMWXStPA.jar
    Aug 21, 2020 6:45:40 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-ie_QpKvPX8NVPEofuhOGHmlZEpEY0KJmjExjLQAcHdI.jar
    Aug 21, 2020 6:45:40 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-3o1c0vCQ6P8ttKz2Zp4gu3bHHZyweq3ipYSDdd5s35k.jar
    Aug 21, 2020 6:45:41 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-wv07fkU-tX8c1WZcNOJtKeeTAKdJ9zCvQBj6XP3-ZSg.jar
    Aug 21, 2020 6:45:41 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-hoYT57Ea_FpO5x2I4_dUY2KO8Y75CueA8V_Rb7q6Edw.jar
    Aug 21, 2020 6:45:41 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-yzwStGh-CjdniZNim73x1sxRFk-ZOin3lK9S5s3WhBo.jar
    Aug 21, 2020 6:45:41 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 0 seconds
    Aug 21, 2020 6:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 21, 2020 6:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 21, 2020 6:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 21, 2020 6:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 21, 2020 6:45:41 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 21, 2020 6:45:41 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93835 bytes, hash c321a428d1c8aab05015dac7184cd65723f962c95f175535a34f48e5452e2907> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-wyGkKNHIqrBQFdrHGEzWVyP5YslfF1U1o09I5UUuKQc.pb
    Aug 21, 2020 6:45:41 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Aug 21, 2020 6:45:43 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-21_11_45_41-5291043341680199938?project=apache-beam-testing
    Aug 21, 2020 6:45:43 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-21_11_45_41-5291043341680199938
    Aug 21, 2020 6:45:43 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-21_11_45_41-5291043341680199938
    Aug 21, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-21T18:45:41.966Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 21, 2020 6:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T18:45:49.322Z: Worker configuration: n1-standard-1 in us-central1-a.
    Aug 21, 2020 6:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T18:45:50.026Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 21, 2020 6:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T18:45:50.067Z: Expanding GroupByKey operations into optimizable parts.
    Aug 21, 2020 6:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T18:45:50.098Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 21, 2020 6:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T18:45:50.169Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 21, 2020 6:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T18:45:50.201Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 21, 2020 6:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T18:45:50.241Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 21, 2020 6:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T18:45:50.277Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 21, 2020 6:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T18:45:50.638Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 21, 2020 6:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T18:45:50.721Z: Starting 5 workers in us-central1-a...
    Aug 21, 2020 6:46:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-21T18:46:05.180Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 21, 2020 6:46:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T18:46:20.245Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Aug 21, 2020 6:46:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T18:46:20.279Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Aug 21, 2020 6:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T18:46:25.670Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 21, 2020 6:46:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T18:46:39.604Z: Workers have started successfully.
    Aug 21, 2020 6:46:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T18:46:39.638Z: Workers have started successfully.
    Aug 21, 2020 6:47:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T18:47:11.732Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 21, 2020 6:47:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T18:47:11.887Z: Cleaning up.
    Aug 21, 2020 6:47:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T18:47:11.956Z: Stopping worker pool...
    Aug 21, 2020 6:47:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T18:47:58.989Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 21, 2020 6:47:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T18:47:59.041Z: Worker pool stopped.
    Aug 21, 2020 6:48:06 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-21_11_45_41-5291043341680199938 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): b255dde1-2adf-4b6b-b82a-1c2dcfe1ba59 and timestamp: 2020-08-21T18:48:06.948000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    12.404

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 21, 2020 6:48:07 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.016 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 2 mins 39.194 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 38s
106 actionable tasks: 73 executed, 33 from cache

Publishing build scan...
https://gradle.com/s/4bf4hj5to3eom

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #898

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/898/display/redirect>

Changes:


------------------------------------------
[...truncated 295.37 KB...]
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 21, 2020 12:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 21, 2020 12:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 21, 2020 12:45:20 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 21, 2020 12:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 21, 2020 12:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 21, 2020 12:45:20 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 21, 2020 12:45:21 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 21, 2020 12:45:21 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 21, 2020 12:45:21 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 21, 2020 12:45:21 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 21, 2020 12:45:21 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 21, 2020 12:45:21 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 21, 2020 12:45:21 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 21, 2020 12:45:22 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 21, 2020 12:45:22 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 21, 2020 12:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 21, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 21, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-Rsbjk_XT_aS0Dx5TWb6LgSpIR3CHcLyjxpMxEgIdHAI.jar
    Aug 21, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-y5lLRpTrTn2XbqUycVQUBfB6anIIR9sV_HpkAAfRT5A.jar
    Aug 21, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-yze0A14yyVPiwlZXa_yPMZJnEh01N2_oxdEb5UteMzs.jar
    Aug 21, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-ftjHXCuHyXqN8m58YwJbHrY8kUGAmMCVzdvKJFzeLho.jar
    Aug 21, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-uhS04Ju1NXOaeQvelTXM41FmvicDPTcJmpfR018RBdU.jar
    Aug 21, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-9S04RGZIHq3q3b_MLZFQ0ozmJWta6iTg4AB65SbGamY.jar
    Aug 21, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-T5kDHOIxgxOnoMGhnvQtSoAMcpzBG3dFAaHkQ1GOnjY.jar
    Aug 21, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-_KnsFfMSchhuiAYehP121Ph8sPnKonJMLBQCexRUdTw.jar
    Aug 21, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-b5JHY8AwYi_hSsWdI1y7wBlWryHm9L1T_AGfIS0nME8.jar
    Aug 21, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-s_owFcIRlBquKO3LsSWBLC52HspzPxweD8fcplFD8mc.jar
    Aug 21, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-Rsbjk_XT_aS0Dx5TWb6LgSpIR3CHcLyjxpMxEgIdHAI.jar
    Aug 21, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-xCWmZqHFK2C3fPT7ZeoMaipoKdYT_k9tdtBxSyE7k4k.jar
    Aug 21, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-SKw7sar5RV2GoLZLT0tYgcb2S66iCBdJhT90LTwPIeA.jar
    Aug 21, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-MTeTIB1UvMj0wa2QoWGBhbKzlSYtDw8GHcHN2R6Ngzc.jar
    Aug 21, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-w7DLz066Fz_hmLtiwyvHwYxjU-elt-8w71sGE7D90UU.jar
    Aug 21, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-oAeF8m1ApinozCvc9yzIz9LBEWrkHKaZIAlK7q3uAAA.jar
    Aug 21, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-RJZTN76nPFi-1rZV_1vF7a8fEc28B1Brk4e6b9QT4Wc.jar
    Aug 21, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-QVvbRYDNvxjNDJeyNAP2ODVU-z4MRTSgOE9eBmumlrk.jar
    Aug 21, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-WaZsG2Tk7V3xrllWbQBdjvrxvi_gWiO_8xICOQDWqAA.jar
    Aug 21, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-pFBgBzJE80rvUIYe1yzK7tTc5rJ29ngff3DL26dqJqA.jar
    Aug 21, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-OZQUEzpikjeNUF9erukDLeiTIY6Hrf2HOgHfJaGOKtM.jar
    Aug 21, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT--ldmo9jzcg6WTxbtE4tBdCc03QQKLb0mUakQNF1YSmc.jar
    Aug 21, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test1908029433377079819.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-RAKU_oeJwfXgPJ7wnfk410hIgToGinz_U191IrjJT9A.jar
    Aug 21, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-ptnAOiM9reaSCFprJdMT7Tc6-9Tdc9ducJ3Qrinbykc.jar
    Aug 21, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-3JGE21xurkPtiFzp4uNc2NEj03S1bJNFFErWuYcr74o.jar
    Aug 21, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-mtNzZUfZoHa1DQgcNW0doawmrZd5BfzdU7OnHQE395w.jar
    Aug 21, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-39LmHFYBzjQrfwYbulNsXzuj8gJVXWvH-QY_v5UQRlU.jar
    Aug 21, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-m3jbTxFmWWgna1dBCWqOm9Ej9RHN3ScnXoEyOt-Wir8.jar
    Aug 21, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-mdGRzJU9hpl942YIMsVVJFcsmULwmMAtxpqW2QQO2Mk.jar
    Aug 21, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-kceVtsJpVpE_jWWBBb0vG8PrV6ahCo7LZqdJw4ZBaJU.jar
    Aug 21, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT--iHpgsHULj18Xnc-YQr5PlejCyQdZVcYgSJ0t276lC8.jar
    Aug 21, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 1 seconds
    Aug 21, 2020 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 21, 2020 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 21, 2020 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 21, 2020 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 21, 2020 12:45:27 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 21, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93834 bytes, hash a4fa21394a09801ced9a1b29b3c168aab30b988cb272f311c8d20790506f51df> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-pPohOUoJgBztmhsps8FoqrMLmIyycvMRyNIHkFBvUd8.pb
    Aug 21, 2020 12:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Aug 21, 2020 12:45:29 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-21_05_45_28-6049103799524672446?project=apache-beam-testing
    Aug 21, 2020 12:45:29 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-21_05_45_28-6049103799524672446
    Aug 21, 2020 12:45:29 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-21_05_45_28-6049103799524672446
    Aug 21, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-21T12:45:28.282Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 21, 2020 12:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T12:46:00.421Z: Worker configuration: n1-standard-1 in us-central1-f.
    Aug 21, 2020 12:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T12:46:01.089Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 21, 2020 12:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T12:46:01.127Z: Expanding GroupByKey operations into optimizable parts.
    Aug 21, 2020 12:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T12:46:01.163Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 21, 2020 12:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T12:46:01.224Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 21, 2020 12:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T12:46:01.260Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 21, 2020 12:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T12:46:01.294Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 21, 2020 12:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T12:46:01.321Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 21, 2020 12:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T12:46:01.797Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 21, 2020 12:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T12:46:01.865Z: Starting 5 workers in us-central1-f...
    Aug 21, 2020 12:46:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-21T12:46:07.926Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
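
The two API-explorer links in that warning are the Monitoring API's metricDescriptors.list and metricDescriptors.delete methods. A hedged sketch of the cleanup they point at, using the Cloud Monitoring Java client: the project ID is a placeholder and the "custom.googleapis.com/dataflow" type prefix is an assumption, so check the filter against your own descriptors before deleting anything.

    import com.google.api.MetricDescriptor;
    import com.google.cloud.monitoring.v3.MetricServiceClient;
    import com.google.monitoring.v3.ListMetricDescriptorsRequest;
    import com.google.monitoring.v3.ProjectName;

    public class StaleMetricDescriptorCleanup {
      public static void main(String[] args) throws Exception {
        String project = "my-gcp-project";  // placeholder, not the CI project
        try (MetricServiceClient client = MetricServiceClient.create()) {
          ListMetricDescriptorsRequest request =
              ListMetricDescriptorsRequest.newBuilder()
                  .setName(ProjectName.of(project).toString())
                  // Assumed prefix for Dataflow-created custom metrics; verify before use.
                  .setFilter("metric.type = starts_with(\"custom.googleapis.com/dataflow\")")
                  .build();
          for (MetricDescriptor descriptor : client.listMetricDescriptors(request).iterateAll()) {
            System.out.println("Deleting stale descriptor: " + descriptor.getType());
            client.deleteMetricDescriptor(descriptor.getName());
          }
        }
      }
    }
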
    Aug 21, 2020 12:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T12:46:28.088Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Aug 21, 2020 12:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T12:46:28.178Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Aug 21, 2020 12:46:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T12:46:33.488Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 21, 2020 12:46:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T12:46:49.495Z: Workers have started successfully.
    Aug 21, 2020 12:46:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T12:46:49.525Z: Workers have started successfully.
    Aug 21, 2020 12:47:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T12:47:22.612Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 21, 2020 12:47:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T12:47:22.775Z: Cleaning up.
    Aug 21, 2020 12:47:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T12:47:22.853Z: Stopping worker pool...
    Aug 21, 2020 12:48:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T12:48:16.423Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 21, 2020 12:48:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T12:48:16.469Z: Worker pool stopped.
    Aug 21, 2020 12:48:23 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-21_05_45_28-6049103799524672446 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 57f8c17a-2ddd-427f-9af7-f8346af0b486 and timestamp: 2020-08-21T12:48:24.029000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    12.188

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 21, 2020 12:48:24 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.022 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.034 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 3 mins 14.503 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 7s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/u5pj5xfjnge3y

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #897

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/897/display/redirect>

Changes:


------------------------------------------
[...truncated 293.60 KB...]
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 21, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 21, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 21, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 21, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 21, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 21, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 21, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
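
The failure above is the coder-inference problem the message itself spells out: a PCollection<Row> has no usable default coder until a schema (or an explicit RowCoder) is attached. A minimal, self-contained sketch of the suggested fix follows; it is not the IT's actual code, the field names simply mirror the SELECT in this log, and the field types and sample values are assumptions.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaFixSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Field names mirror the query in this log; the types are an assumption.
        final Schema schema = Schema.builder()
            .addStringField("author")
            .addStringField("type")
            .addStringField("title")
            .addInt64Field("score")
            .build();

        PCollection<Row> rows =
            p.apply(Create.of("story", "job"))  // placeholder input, not the BigQuery read
             .apply(ParDo.of(new DoFn<String, Row>() {
               @ProcessElement
               public void process(@Element String type, OutputReceiver<Row> out) {
                 out.output(Row.withSchema(schema)
                     .addValues("someone", type, "a title", 3L)
                     .build());
               }
             }));

        // Without one of these two calls, coder inference fails exactly as in the trace above.
        rows.setRowSchema(schema);
        // rows.setCoder(org.apache.beam.sdk.coders.RowCoder.of(schema));

        p.run().waitUntilFinish();
      }
    }

setRowSchema lets Beam derive a SchemaCoder for the Row output; setCoder(RowCoder.of(schema)) is the explicit-coder alternative the first root cause in the message refers to.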

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 21, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 21, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 21, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 21, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 21, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 21, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 21, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 21, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
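
For context, the usedFields list and the pushed-down filter above correspond to BigQueryIO's Storage Read API options. A rough sketch of that mapping at the IO level, assuming a placeholder table reference; this illustrates the underlying read API, not the code the SQL planner actually generates.

    import com.google.api.services.bigquery.model.TableRow;
    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.values.PCollection;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        PCollection<TableRow> rows = p.apply(
            BigQueryIO.readTableRows()
                .from("my-project:my_dataset.HACKER_NEWS")  // placeholder table reference
                .withMethod(Method.DIRECT_READ)
                // Only the columns the query uses are requested...
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                // ...and the supported predicate is pushed into the storage read session.
                .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }
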
    Aug 21, 2020 6:45:14 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 21, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 21, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-dKjPcv5WgJLeeP_r29Pxel2YuFlOw_SujDtkSfofq0I.jar
    Aug 21, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-QZc-q1hBlx96c_HXDFVzSAFTud6QKsRNtMoKZDRTxLo.jar
    Aug 21, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-b5bXVSEklVW2EJgpxfK5TFvSmdHaY20XceKDkQZHI04.jar
    Aug 21, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-bIvF7BaFzWm_VVKxI1vyIKSklHAhT1a8b40TcW_plvw.jar
    Aug 21, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-RS9TVhGVAuEoh4Xy1RbCDZKFQ3UqOQ8nQkFZlnIjMzs.jar
    Aug 21, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-KvT4TW_UsWPiaP3QJKTciVVE-RAMWwjH5r5reuf0fx4.jar
    Aug 21, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-J_TGnuRx_4CjQzHZdkZJMwH28vrhb7Q-ucebVHZonnw.jar
    Aug 21, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-fsE7h5ZOrX7y068jWIZxHV9LHwTR6S28Xw_DF4nEKJk.jar
    Aug 21, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-dKjPcv5WgJLeeP_r29Pxel2YuFlOw_SujDtkSfofq0I.jar
    Aug 21, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-AfE76LDbcOSKnuiHCyZGSMRVilGRQFdbP3Bg0Kz5jIo.jar
    Aug 21, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-RBfozQKglvN8t3yom3vioGcRXkPoAio5f3JLlkKOWRc.jar
    Aug 21, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-4o3D_uHykAKCH3FPCiUvI0y44GuAf2ZwPeO66ybO4DE.jar
    Aug 21, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-mqEZjkTeFDWaFikRSIWmZUQze5DkluNIWRUV5rUuQbs.jar
    Aug 21, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test183472549015233553.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-1fpJzd-jq3P8hevjlLICsVy1RXFcp1X23VkVnhcWDBU.jar
    Aug 21, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-Eg-JETWcqzSAGXjHak6WksLt3-r0MXchQfgoSdLZWQY.jar
    Aug 21, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-1xngqatxURZEXO0uAc8vekNAahHj2DdkRblG2RGzdSQ.jar
    Aug 21, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-2xeBKACP2f4P3eDejMHwLZwgZLKwZjpEkk4Quu3Vtfk.jar
    Aug 21, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-Dgs4WeqIyAz4K1Tynp4g7IkZLj3vO0oOzxp329JgJAI.jar
    Aug 21, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-U0GSc3BrhbnkCD7Tb-1TznDNr_wt607nOwOHA1kPL4w.jar
    Aug 21, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-z04e49cX089_2m9GMezQkYTA55zjq7PgfTqkH1VLRcU.jar
    Aug 21, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-Zy8IFoWXMHypobmpxAhKXy4gU47AuxOgQjFzCOzdc1M.jar
    Aug 21, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-25tshj8K4VQPNJFCZ5p127S1v9YaSbSNz-UYipqUPO4.jar
    Aug 21, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-J5HnXHDxmCoSGDYnx_hVGgzHIjF2wj7QQiuVG4rshYA.jar
    Aug 21, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-v71HwxLQduzFLFnGg6NAWnjLVVWetRTufcp_1eyOMJU.jar
    Aug 21, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-n9vV-3-XygFdwVg2Mgr31TE2czzQvq9yFNgDjaWw65E.jar
    Aug 21, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-uxheailtwQBO5PYMz8EQC-AFMJQZEeSXS3LbgzxSW7o.jar
    Aug 21, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-MkBAvLek3IQ-VDznW_vPo5bUmG9P4JIKf4vtnx_Dq-0.jar
    Aug 21, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-XQueelFH4DaHin_49qNtve_LascGsJW-nawJZ3vMAeI.jar
    Aug 21, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-0UxRQVPplE-gDzIPh17LDaLafXREWwk-6SOsataBBt0.jar
    Aug 21, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-RQsGTLBVnUAMvNrxWwkms79T9LfTHwDU3KDcWqdmSP8.jar
    Aug 21, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-_rJ9cFtoLzFUlDsCNy1PEUdzpJhdhpP23ghWBLga4fI.jar
    Aug 21, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 1 seconds
    Aug 21, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 21, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 21, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 21, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 21, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 21, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93833 bytes, hash 203d4cbd4616de6322820f7c21c0d3f817d61d631b372863e909c4a884b96073> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-ID1MvUYW3mMigg98IcDT-BfWHWMbNyhj6QnEqIS5YHM.pb
    Aug 21, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Aug 21, 2020 6:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-20_23_45_18-9740817880041509812?project=apache-beam-testing
    Aug 21, 2020 6:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-20_23_45_18-9740817880041509812
    Aug 21, 2020 6:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-20_23_45_18-9740817880041509812
    Aug 21, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-21T06:45:18.607Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 21, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T06:45:27.258Z: Worker configuration: n1-standard-1 in us-central1-f.
    Aug 21, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T06:45:28.135Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 21, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T06:45:28.179Z: Expanding GroupByKey operations into optimizable parts.
    Aug 21, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T06:45:28.216Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 21, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T06:45:28.281Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 21, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T06:45:28.307Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 21, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T06:45:28.336Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 21, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T06:45:28.362Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 21, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T06:45:28.682Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 21, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T06:45:28.754Z: Starting 5 workers in us-central1-f...
    Aug 21, 2020 6:45:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-21T06:45:33.931Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 21, 2020 6:45:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T06:45:56.287Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
    Aug 21, 2020 6:45:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T06:45:56.312Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
    Aug 21, 2020 6:46:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T06:46:01.560Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 21, 2020 6:46:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T06:46:23.838Z: Workers have started successfully.
    Aug 21, 2020 6:46:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T06:46:23.920Z: Workers have started successfully.
    Aug 21, 2020 6:47:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T06:47:02.435Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 21, 2020 6:47:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T06:47:02.645Z: Cleaning up.
    Aug 21, 2020 6:47:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T06:47:02.731Z: Stopping worker pool...
    Aug 21, 2020 6:47:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T06:47:51.259Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 21, 2020 6:47:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T06:47:51.356Z: Worker pool stopped.
    Aug 21, 2020 6:47:59 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-20_23_45_18-9740817880041509812 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): dcc0dd46-0349-4164-8707-d6ec141eb0a6 and timestamp: 2020-08-21T06:47:59.734000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     17.52

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 21, 2020 6:48:00 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 7,5,main]) completed. Took 2 mins 55.426 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 43s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/gz5p7rgdqez5a

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #896

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/896/display/redirect?page=changes>

Changes:

[daniel.o.programmer] [BEAM-3301] Consolidating starcgen method names in graph/fn.go and

[daniel.o.programmer] [BEAM-10289] Fixing miscalculated SDF split fraction.

[noreply] [BEAM-10701] Generate Codecov XML (#12566)

[noreply] [BEAM-9979] Fix read index being reported to be reset after any metrics

[noreply] [BEAM-9546] DataframeTransform can now consume a schema-aware


------------------------------------------
[...truncated 293.40 KB...]
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 21, 2020 12:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 21, 2020 12:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 21, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 21, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 21, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 21, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 21, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 21, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 21, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 21, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 21, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 21, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 21, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 21, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 21, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 21, 2020 12:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 21, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 21, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-UnUrtc4uwdC7noEfR-ChNV6ga5GJbYLwzdEaqfrdnbU.jar
    Aug 21, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-587H-CG8d-Z-mCLw0QYzWrKbcwcstBYDyes3ukCgOUI.jar
    Aug 21, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-9703sT_YksPQ3D5JzeFobyqxGObrkmklBw9mA3ny5dI.jar
    Aug 21, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-oVnGMM1WxSGi1h8OyvNbTb7es7uBpTT0V8YWaOWWIn4.jar
    Aug 21, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-_W0JOzk_5ji39pO79lrvTgMVuF5lkBZf5VFo-YZ0Cqk.jar
    Aug 21, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-CKaBPUVen4ZORQF9I-BlIuHsRfvxnrliEfJWarLdu7o.jar
    Aug 21, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test175366128853778220.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-F33sirfWZRD9S3iZkJ9TdaEEKDXfl9QN65hCCwfGDnA.jar
    Aug 21, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-Wdhq72GMwceGD1GKoJ_rRpy3J_tGvcwPzH3BZ20Zf4s.jar
    Aug 21, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-EaDl_uZR7Mkkcxozs0J2o1-AD7XetmuHaG-xKxZHupk.jar
    Aug 21, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-DnIJznz7MZdAPrSm8129euLWRaS1ocGg-02zDJ-mNJE.jar
    Aug 21, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-UnUrtc4uwdC7noEfR-ChNV6ga5GJbYLwzdEaqfrdnbU.jar
    Aug 21, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-YXav3Ff6CAkaQGcJcGarwOKWOKy-Kfy5LkCutnHr73s.jar
    Aug 21, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-byfG26KerZZFECxGm7DBQnV0IRpkCthUoukT9xpTpCw.jar
    Aug 21, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-xqDPSg5KkG0_vAz1sMWG1ERA262C7ARZjh_z5js7urw.jar
    Aug 21, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-n2xuAmxRP7yBrAR2WXxxuFJUBDpP2-IrCXwDqZzMnSU.jar
    Aug 21, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-U7lNBaHetULA4KhVjbaNwa0NuMlBawJpvI3WMXWe3hM.jar
    Aug 21, 2020 12:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-1RbQtn7zQRTQ2Cnl-i6AgCFSpBeRoh2RUhXsUc5hIHI.jar
    Aug 21, 2020 12:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-YPqTZ34YU4jcKusOyhOx6lGYBvufA8-uL4t_ZwWiFsM.jar
    Aug 21, 2020 12:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-mfcccmE2XjpKNDAwWcZ-Jpw7qn4HeCWgzst2XcRCLSU.jar
    Aug 21, 2020 12:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-uqKT4_TtapeqyoFdhcDq0Bg-erpCozLZqg2R7fZ3Drg.jar
    Aug 21, 2020 12:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-LMHpbwJ38gTrzAc-S_dwrUrcnMY_zu3RJO9SPI2XcdI.jar
    Aug 21, 2020 12:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-ehdBIb_F5JxDw8qBxO-gIBmcWwav5I_HmIDoXfoTFZ4.jar
    Aug 21, 2020 12:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-aWfW_oKY49FSnCWiSJaD60qFonDX-gUh6litNPBnXfQ.jar
    Aug 21, 2020 12:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-4tZK26r_3R7SIx9cfoTHGcoh-orRn9teI1yJZOhsd0M.jar
    Aug 21, 2020 12:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-n6kbkgF3qGAiATgZcCIew2-ywnR6lAro86KbafkXkNg.jar
    Aug 21, 2020 12:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-HwMnuBIpju-i55AfThy6SWm2PpHgyVOCD8O5jZGqO8Y.jar
    Aug 21, 2020 12:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-yeuCaJG-K2FUPJ8605-uz9xHaySHY_E4ctS2cboMQQc.jar
    Aug 21, 2020 12:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-hQZHUrK34lTdqTMldV5RxKs828_9KQLV0x7MbmfUnAQ.jar
    Aug 21, 2020 12:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-BmP4e-unLljNOjx1a7OxRgO8VCMhBzv--38oBFJN-f0.jar
    Aug 21, 2020 12:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-KvhVKBDSSO2Fy7lNUTA8PslLkwt2VeEvztzMyvkMTT0.jar
    Aug 21, 2020 12:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-RBhWNQqBrq8U44BLRS8G1Od_-G1prAB8PGwSUymrKz0.jar
    Aug 21, 2020 12:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 1 seconds
    Aug 21, 2020 12:45:22 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 21, 2020 12:45:22 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 21, 2020 12:45:22 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 21, 2020 12:45:22 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 21, 2020 12:45:22 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 21, 2020 12:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93835 bytes, hash 664bb1fa973aed213ad4f9650e8d2fa55e13e2d65674466382c0b481ca9ebb9e> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Zkux-pc67SE61PllDo0vpV4T4tZWdEZjgsC0gcqeu54.pb
    Aug 21, 2020 12:45:22 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Aug 21, 2020 12:45:24 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-20_17_45_22-9827967703029346502?project=apache-beam-testing
    Aug 21, 2020 12:45:24 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-20_17_45_22-9827967703029346502
    Aug 21, 2020 12:45:24 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-20_17_45_22-9827967703029346502
    Aug 21, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-21T00:45:22.870Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
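
The warning just above is the expected outcome of submitting this job with a fixed worker pool: with autoscaling disabled, only the requested worker count matters and the configured maximum is ignored. A brief sketch of the corresponding Dataflow options is below; the setter names are taken from DataflowPipelineWorkerPoolOptions as I understand them and should be treated as assumptions to verify against the SDK version in use.

    // Sketch only: fixed-size worker pool settings that produce the warning above.
    // Setter names are assumed from DataflowPipelineWorkerPoolOptions; verify before use.
    import org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions;
    import org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions.AutoscalingAlgorithmType;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class WorkerPoolOptionsSketch {
      public static void main(String[] args) {
        DataflowPipelineWorkerPoolOptions options =
            PipelineOptionsFactory.fromArgs(args).as(DataflowPipelineWorkerPoolOptions.class);

        // With autoscaling disabled, maxNumWorkers is ignored; numWorkers fixes the pool size.
        options.setAutoscalingAlgorithm(AutoscalingAlgorithmType.NONE);
        options.setNumWorkers(5);
        options.setMaxNumWorkers(5); // present but ignored, which triggers the warning
      }
    }
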
    Aug 21, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T00:45:30.865Z: Worker configuration: n1-standard-1 in us-central1-a.
    Aug 21, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T00:45:31.842Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 21, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T00:45:31.886Z: Expanding GroupByKey operations into optimizable parts.
    Aug 21, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T00:45:31.920Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 21, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T00:45:31.990Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 21, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T00:45:32.036Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 21, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T00:45:32.083Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 21, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T00:45:32.129Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 21, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T00:45:32.556Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 21, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T00:45:32.638Z: Starting 5 workers in us-central1-a...
    Aug 21, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-21T00:45:37.410Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 21, 2020 12:45:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T00:45:59.199Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Aug 21, 2020 12:45:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T00:45:59.238Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Aug 21, 2020 12:46:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T00:46:04.583Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 21, 2020 12:46:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T00:46:19.993Z: Workers have started successfully.
    Aug 21, 2020 12:46:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T00:46:20.027Z: Workers have started successfully.
    Aug 21, 2020 12:46:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T00:46:52.592Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 21, 2020 12:46:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T00:46:52.738Z: Cleaning up.
    Aug 21, 2020 12:46:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T00:46:52.816Z: Stopping worker pool...
    Aug 21, 2020 12:47:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T00:47:56.309Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 21, 2020 12:47:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T00:47:56.395Z: Worker pool stopped.
    Aug 21, 2020 12:48:05 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-20_17_45_22-9827967703029346502 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): b0f1adc1-172f-47dc-a92d-dbd8c8ff0319 and timestamp: 2020-08-21T00:48:05.913000000Z:
                     Metric:                    Value:
                   read_time                     12.76
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 21, 2020 12:48:06 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 2 mins 57.712 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 49s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/hmruwux3t66zs

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #895

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/895/display/redirect?page=changes>

Changes:

[david.janicek] [BEAM-10292] LocalFileSystem.matchNewResource method throws

[Kamil Wasilewski] [BEAM-10674] Python CoGBK: Remove unnecessary step. Adjust parameters to

[Kamil Wasilewski] [BEAM-10674] Python CoGBK: Add streaming job

[Kyle Weaver] [BEAM-9118] Increase portable_runner_test timeout.

[Kyle Weaver] [BEAM-10770] Remove DataflowPortabilityApiUnsupported annotation.

[david.janicek] [BEAM-10292] changes after review


------------------------------------------
[...truncated 298.56 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 20, 2020 6:46:56 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 20, 2020 6:46:56 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 20, 2020 6:46:56 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 20, 2020 6:46:56 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 20, 2020 6:46:56 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 20, 2020 6:46:56 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 20, 2020 6:46:56 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
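
The exception above names its own fix: the Row output of the RowMonitor ParDo needs either an explicit Coder or, more simply, a schema. A minimal sketch of the schema route follows; the field names come from the query logged above, while the field types and the helper method are assumptions for illustration.

    // Minimal sketch (not the actual test code): give a PCollection<Row> a schema so
    // the SDK can derive its RowCoder, as the error message suggests.
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      // Field names from the query above; the types here are assumptions.
      static final Schema OUTPUT_SCHEMA =
          Schema.builder()
              .addStringField("author")
              .addStringField("type")
              .addStringField("title")
              .addInt64Field("score")
              .build();

      // Hypothetical helper: apply it to the ParDo output before it is consumed.
      static PCollection<Row> withSchema(PCollection<Row> rows) {
        // Equivalent alternative: rows.setCoder(RowCoder.of(OUTPUT_SCHEMA));
        return rows.setRowSchema(OUTPUT_SCHEMA);
      }
    }
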

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 20, 2020 6:46:56 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 20, 2020 6:46:56 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 20, 2020 6:46:56 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 20, 2020 6:46:56 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 20, 2020 6:46:56 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 20, 2020 6:46:56 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 20, 2020 6:46:57 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 20, 2020 6:46:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 20, 2020 6:46:58 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 20, 2020 6:47:00 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 20, 2020 6:47:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-0YD03cFpXE6NvAHiqmgRSgCb4vvPX7ptgfFf4Yu9Dd0.jar
    Aug 20, 2020 6:47:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-JClUMYa2jc0L-5TCH5dY69ikQg7-UhQCqFBA80mI7h8.jar
    Aug 20, 2020 6:47:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-CXLQ7nCuFvJpzdbzQFeKkd08InIMZTTjpPVEFo5suB0.jar
    Aug 20, 2020 6:47:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-gAdPc7xN_08yQh8PZYV3lh5DG8ntrVAWW21GxTw_RSg.jar
    Aug 20, 2020 6:47:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-HOYO4cz86Echr728s-A1ORWb9aYB2Bu7EWXDJm8qrQQ.jar
    Aug 20, 2020 6:47:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test340611041652656147.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-ztAdhJ1qhL_V76f-86AxmhRGfZfDlvUOM8IAgvqCZKc.jar
    Aug 20, 2020 6:47:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-PS_3qReA43gaAnHAbNsAMqe9maFB6dnFkZZfaF6vQ2c.jar
    Aug 20, 2020 6:47:00 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-4Xz_XB4DGIV8N8FTI9EjPDRKxjWQLe2Gvs79VQ8Z0AY.jar
    Aug 20, 2020 6:47:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-F528eVO8rspbj7QIe9bDAhKumSoJkA8m3MC_YO87YF8.jar
    Aug 20, 2020 6:47:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-T4uDPXzZBF_LovE4IdJWd5rm9QSKyr946rT77GPUpRs.jar
    Aug 20, 2020 6:47:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-bmgUB9Z5dUolqBkNUkGtowfy331ZLgOXeuKqBzz3-RY.jar
    Aug 20, 2020 6:47:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-ZzXMKkyOLZ8wEhlLPnSV6vI28NB_IRtZgzMwVDLM8Ec.jar
    Aug 20, 2020 6:47:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-KnvKgvPqoPZ73dh8Rl4VharwgkJqE5rz2JXdOQrog9A.jar
    Aug 20, 2020 6:47:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-yoeibe2UEYXp2QxW6UQnTGmPfxlgQQ24vV45Rczppws.jar
    Aug 20, 2020 6:47:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-4Xz_XB4DGIV8N8FTI9EjPDRKxjWQLe2Gvs79VQ8Z0AY.jar
    Aug 20, 2020 6:47:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-Onf2HzWgzj7LTSLqxF0ZDqqgSoI4vJ-8Opm7786K4xs.jar
    Aug 20, 2020 6:47:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-12Td0Qn0GgZEsUvDN-MMp2CwVJgN8s_4ehJ9QAOtp_A.jar
    Aug 20, 2020 6:47:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-98bUrdcxo_j55Ah2rJqZS-ZAgAkrhEgO9WlYjGfkiKI.jar
    Aug 20, 2020 6:47:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-sI116NHKWXo32gBap0UaoOJdERFjKzkBnt8ysP7KKC0.jar
    Aug 20, 2020 6:47:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-Nf9iEQd0ba2IiaMl-_AmVtgU4nu-akgIIvFQuvsniSU.jar
    Aug 20, 2020 6:47:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-xArGRybkrvqCGI_yzMGhvjDC1Dro0oMuDk9mvwBbyd4.jar
    Aug 20, 2020 6:47:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-EbJ_1JFv-e6nV4C0D1OF8TvcWvXZ02fJk67MPM8xT9E.jar
    Aug 20, 2020 6:47:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-dsMiCCacHIXp-8Jd0EkPoy61B_8JPB4zKsyb-MaCVoE.jar
    Aug 20, 2020 6:47:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-UKquFlFXGurnSxAZ3kzZUf2Px74v0RFeReC8UtfZ6Og.jar
    Aug 20, 2020 6:47:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-hxpg5WUA_kUfy-NMaUEMaEfRJjLhyyflLkOs8Smvj2U.jar
    Aug 20, 2020 6:47:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-I36HR31emmwPu8C13kd0_Bu_KdH94kZZlSX1OJtpnZo.jar
    Aug 20, 2020 6:47:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-M_wMNo1KcFB7OpdUb649ZEFFoKICh83N45JxPMdGMBk.jar
    Aug 20, 2020 6:47:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-QgmGlrDQjxlN1fyFU8yLrKbL9LER5eO35vq6dkWRFC4.jar
    Aug 20, 2020 6:47:01 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-PazNa0FSpWbxllfh2l6bsDnmzOZgWtjjkwW4Z3bmuPw.jar
    Aug 20, 2020 6:47:01 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-5zmJniWjDPZxbVZvbct8jcYpecPNaLbAyLc2sIvrKac.jar
    Aug 20, 2020 6:47:01 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-nbRow7mCyIAUwyaBuMQa4xzWPGgnjgFKooIgo2mKDXU.jar
    Aug 20, 2020 6:47:01 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 1 seconds
    Aug 20, 2020 6:47:01 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 20, 2020 6:47:02 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 20, 2020 6:47:02 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 20, 2020 6:47:02 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 20, 2020 6:47:02 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 20, 2020 6:47:02 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93833 bytes, hash 26466a814f117f847165939770bcd7eb9e9632bcc4e34d4166ed61f486c84d10> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-JkZqgU8Rf4RxZZOXcLzX656WMrzE401BZu1h9IbITRA.pb
    Aug 20, 2020 6:47:02 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Aug 20, 2020 6:47:03 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-20_11_47_02-2295538111671242774?project=apache-beam-testing
    Aug 20, 2020 6:47:03 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-20_11_47_02-2295538111671242774
    Aug 20, 2020 6:47:03 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-20_11_47_02-2295538111671242774
    Aug 20, 2020 6:47:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-20T18:47:02.567Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 20, 2020 6:47:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-20T18:47:16.792Z: Worker configuration: n1-standard-1 in us-central1-b.
    Aug 20, 2020 6:47:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-20T18:47:17.531Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 20, 2020 6:47:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-20T18:47:17.707Z: Expanding GroupByKey operations into optimizable parts.
    Aug 20, 2020 6:47:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-20T18:47:17.736Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 20, 2020 6:47:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-20T18:47:17.805Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 20, 2020 6:47:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-20T18:47:17.836Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 20, 2020 6:47:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-20T18:47:17.883Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 20, 2020 6:47:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-20T18:47:18.043Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 20, 2020 6:47:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-20T18:47:19.858Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 20, 2020 6:47:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-20T18:47:20.242Z: Starting 5 workers in us-central1-b...
    Aug 20, 2020 6:47:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-20T18:47:41.293Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 20, 2020 6:47:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-20T18:47:43.466Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 20, 2020 6:48:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-20T18:48:08.465Z: Workers have started successfully.
    Aug 20, 2020 6:48:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-20T18:48:08.513Z: Workers have started successfully.
    Aug 20, 2020 6:48:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-20T18:48:45.856Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 20, 2020 6:48:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-20T18:48:46.013Z: Cleaning up.
    Aug 20, 2020 6:48:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-20T18:48:46.142Z: Stopping worker pool...
    Aug 20, 2020 6:49:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-20T18:49:30.372Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 20, 2020 6:49:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-20T18:49:30.469Z: Worker pool stopped.
    Aug 20, 2020 6:49:40 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-20_11_47_02-2295538111671242774 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): cca55f7e-07a4-4a02-a7ae-3ffd4b17a347 and timestamp: 2020-08-20T18:49:40.322000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    15.426

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 20, 2020 6:49:40 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 3 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 7,5,main]) completed. Took 2 mins 52.507 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 11s
106 actionable tasks: 76 executed, 30 from cache

Publishing build scan...
https://gradle.com/s/s52lflxvkyqd4

Stopped 2 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #894

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/894/display/redirect>

Changes:


------------------------------------------
[...truncated 294.00 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 20, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 20, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 20, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 20, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 20, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 20, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 20, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
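
Both coder failures above share one root cause: the output of ParDo(RowMonitor) is a PCollection of Beam Row, and the SDK cannot infer a coder for Row unless a schema is attached to the collection. The sketch below shows the remedy the exception itself suggests, attaching a schema so RowCoder can be derived; the schema fields and sample values are hypothetical stand-ins for the query's projected columns, not the integration test's actual code.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.MapElements;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;
    import org.apache.beam.sdk.values.TypeDescriptors;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        // Hypothetical schema mirroring the projected columns (author, type, title, score).
        Schema schema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt32Field("score")
                .build();

        PCollection<Row> rows =
            p.apply(
                Create.of(Row.withSchema(schema).addValues("alice", "story", "Hello", 3).build())
                    .withRowSchema(schema));

        // A transform that emits Row has no inferable coder; attaching the schema to its
        // output is exactly what the IllegalStateException above asks for.
        PCollection<Row> passedThrough =
            rows.apply(MapElements.into(TypeDescriptors.rows()).via((Row r) -> r))
                .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }

Without the setRowSchema call on the pass-through output, pipeline construction fails with the same "Unable to return a default Coder ... Please provide a schema instead using PCollection.setRowSchema" error seen in both tests.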

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 20, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 20, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 20, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 20, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 20, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 20, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 20, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 20, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
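
The pushed-down filter above is what distinguishes the passing test from the two failing ones: only the four referenced columns are read, and the WHERE clause is evaluated by the BigQuery Storage Read API instead of inside the pipeline. A rough hand-written equivalent against BigQueryIO is sketched below; the table spec is a placeholder and this is not the code the test actually runs.

    import com.google.api.services.bigquery.model.TableRow;
    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.values.PCollection;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Project and filter on the server side via the Storage Read API -- the same
        // effect the SQL planner achieves with BeamPushDownIOSourceRel above.
        PCollection<TableRow> rows =
            p.apply(
                BigQueryIO.readTableRows()
                    .from("bigquery-public-data:hacker_news.full") // placeholder table spec
                    .withMethod(TypedRead.Method.DIRECT_READ)
                    .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                    .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }

Because the projection and filter run inside BigQuery, only matching rows ever reach the workers, which is what keeps read_time in the metrics reported further below small.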
    Aug 20, 2020 12:45:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 20, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 20, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-1crx7Utx70zlJN8Sosfd-wZEQDvyM1swbL9q74rUDhI.jar
    Aug 20, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-yILeLXLDh__QV8Hyntiajjavx66Zx4TGewTF1JQqk8Q.jar
    Aug 20, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-SMOa715ktzxx1vF_Buvz6gbU-3kmlrxYVl_jrBjC6xg.jar
    Aug 20, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-3Jf_HDOGN9vrQggiJsPSfC22PV7XkAlwSTtcGHXvlGw.jar
    Aug 20, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-1crx7Utx70zlJN8Sosfd-wZEQDvyM1swbL9q74rUDhI.jar
    Aug 20, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-2RA4HgiIf2d22lR0vI4JeXSv3yxapk20ic3ML_UJTjY.jar
    Aug 20, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-nBC0hq5ZVjy993LCPam1I9prUJxeesH07_TFB5Jsi9Y.jar
    Aug 20, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-szf0wsBYyUEIpYuBdX8GVcLMrz5_-8mT_JkDsqBcoGU.jar
    Aug 20, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-93Ep64_HXS9fBKifV6u_8p2TDnEsj1XwxSGEp3z9Y0k.jar
    Aug 20, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8333070574261265149.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-08aC22K8xvlpXjbhgfepwxmkif4KFb8uxPDnleF5f54.jar
    Aug 20, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-DtgVLUpJQfcyYDmDCRJqgKifcD_mVAHjhc6NVWXH5TU.jar
    Aug 20, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-6cI5xo0g-Y5Ndn3lNNxMhO1IjvlMFyacunQ6SLzeukc.jar
    Aug 20, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-zjZt_ZcLdIMEO0Sd38esw8Lp2k-BpK9rJxSGnzNuNcc.jar
    Aug 20, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-DGA7oJoeVMTRIG11oiGY3ohZtZYWubf04m4JLU2mv4g.jar
    Aug 20, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-N_uFCYNC5CwVny5jbKlh_vpD6wyd9q8A8maN5t0nX_I.jar
    Aug 20, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-GJtcxMnqCnLOQ9h149JpYnEb-H4G8kYfD8s04rLl1HM.jar
    Aug 20, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-LPIIJBq7a1HVOypR2TSlwPwP9bX5v4s97nC17wkm1zc.jar
    Aug 20, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-ODGqaJQ73Csl7y5DeE2N6Wvr3tnWKqdRBndLWFCSqT8.jar
    Aug 20, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-MU3SrGHPAlGVgbfsyrec0H5uzaRa8Ac7UMx957_Yonc.jar
    Aug 20, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-fEEBidJHJHZpNvf6iWDEy_ziha8E65RJ0J9JwbjKNvg.jar
    Aug 20, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-bOBXWoqLE6y6jJEFeWsQpH49XZhT15H7veDeuBSfX90.jar
    Aug 20, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-t64FeVd2Dshj_yeoF55K-PAiyvF-O9il3CIcKr3kjMk.jar
    Aug 20, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-YJUfnoA-wHdT-YUG_fJocAOnX1EBX1qkI9W5a0412NA.jar
    Aug 20, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-soP21gh4ayaNCJLdKSMJGvU324PAi88P0u0jTmCWfVM.jar
    Aug 20, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-TmVsKE4HEeDKa2GtFy58PIVk2J7KCRtvmcGnmu2tGd4.jar
    Aug 20, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-_-YK7kCZ39bstG5giwW1sozm_aAOvNS_Op3Rij5FFfU.jar
    Aug 20, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-jeDQgrCrFtw4Ul0yYMqWP1OND6WywGYM-3yOyQlIwu8.jar
    Aug 20, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-TvDq172xUPfUJcZBKo19L0fUCGuelnmb5xaDAexYXts.jar
    Aug 20, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-nqYaBGl3pKM6uwqYt71yS_pyCheGfVlC_1E9EQ7ursQ.jar
    Aug 20, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-qILce8FJoOWoi80dF3AHy6YcQgLE-OTPS_VVInf5Yoc.jar
    Aug 20, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-KUl_2TkwhJS8yWtxUBrLPlDHoNOGI0hc8LucG3_h8KY.jar
    Aug 20, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 1 seconds
    Aug 20, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 20, 2020 12:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 20, 2020 12:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 20, 2020 12:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 20, 2020 12:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 20, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93834 bytes, hash 72607e62c1d4b3dc0954f49f5510a74661262d7ff205cf44a1067973cab96ce5> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-cmB-YsHUs9wJVPSfVRCnRmEmLX_yBc9EoQZ5c8q5bOU.pb
    Aug 20, 2020 12:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Aug 20, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-20_05_45_20-10866094387437451941?project=apache-beam-testing
    Aug 20, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-20_05_45_20-10866094387437451941
    Aug 20, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-20_05_45_20-10866094387437451941
    Aug 20, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-20T12:45:20.407Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 20, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-20T12:45:28.492Z: Worker configuration: n1-standard-1 in us-central1-f.
    Aug 20, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-20T12:45:29.252Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 20, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-20T12:45:29.295Z: Expanding GroupByKey operations into optimizable parts.
    Aug 20, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-20T12:45:29.329Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 20, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-20T12:45:29.396Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 20, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-20T12:45:29.428Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 20, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-20T12:45:29.463Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 20, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-20T12:45:29.494Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 20, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-20T12:45:30.196Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 20, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-20T12:45:30.269Z: Starting 5 workers in us-central1-f...
    Aug 20, 2020 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-20T12:45:44.397Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 20, 2020 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-20T12:45:59.333Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 20, 2020 12:46:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-20T12:46:19.638Z: Workers have started successfully.
    Aug 20, 2020 12:46:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-20T12:46:19.681Z: Workers have started successfully.
    Aug 20, 2020 12:47:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-20T12:46:55.264Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 20, 2020 12:47:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-20T12:46:55.412Z: Cleaning up.
    Aug 20, 2020 12:47:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-20T12:46:55.490Z: Stopping worker pool...
    Aug 20, 2020 12:47:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-20T12:47:45.890Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 20, 2020 12:47:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-20T12:47:45.996Z: Worker pool stopped.
    Aug 20, 2020 12:47:53 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-20_05_45_20-10866094387437451941 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 807d6a81-d4b7-407f-871b-864e6d0531c9 and timestamp: 2020-08-20T12:47:54.026000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    13.882

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 20, 2020 12:47:54 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
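
The warning above means the metrics printed by this run were computed but never exported: the publisher skips publication when no measurement/database is configured. A minimal sketch of supplying those settings is below; the builder method names on org.apache.beam.sdk.testutils.publishing.InfluxDBSettings (withHost, withDatabase, withMeasurement, get) and the sample values are assumptions to be checked against the test-utils module in use, not confirmed API.

    import org.apache.beam.sdk.testutils.publishing.InfluxDBSettings;

    public class InfluxSettingsSketch {
      public static void main(String[] args) {
        // Assumed builder API and placeholder values -- verify against the Beam
        // test-utils version in use before relying on this.
        InfluxDBSettings settings =
            InfluxDBSettings.builder()
                .withHost("http://localhost:8086")            // hypothetical endpoint
                .withDatabase("beam_test_metrics")            // hypothetical database name
                .withMeasurement("sql_bqio_read_java_batch")  // hypothetical measurement name
                .get();
        // With both database and measurement present, publishWithCheck would no longer
        // log "Missing property -- measurement/database" and results would be written.
        System.out.println(settings);
      }
    }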

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.017 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 2 mins 48.612 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 39s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/4bhspdrixhp5a

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #893

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/893/display/redirect?page=changes>

Changes:

[douglas.damon] Add Filter lesson to Go SDK katas

[douglas.damon] Reformat code for readability

[yoshiki.obata] [BEAM-9980] Configure Python versions for direct test suite tasks via

[douglas.damon] Change license info format in task-info.yaml

[douglas.damon] Update stepik course

[kevinsijo] [BEAM-9920] Enabling artifact staging for xlang transforms

[kevinsijo] documentation(beam/sdks/go): added apache license and function docstring

[kevinsijo] fix(beam/sdks/go): correcting changed function name

[noreply] Merge pull request #12581 from [BEAM-10378] Add Azure Blob Storage


------------------------------------------
[...truncated 291.97 KB...]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_3/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 20, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 20, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 20, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 20, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 20, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 20, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 20, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 20, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 20, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 20, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 20, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 20, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 20, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 20, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 20, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 20, 2020 6:45:13 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 20, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 20, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-P2wToHwZKZ2-FoCTL-bgk-bRHFHM3VxT5R8_PuVQyKU.jar
    Aug 20, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-SlAkqjF6AoTQZi9ioyAfUIDwOrROEVBSvK7r6B2zJGo.jar
    Aug 20, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-mPM03APpmINxZoM80Agx3hyeQmn2s9WHQQlx0LTjspo.jar
    Aug 20, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5918634144078612724.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-XwprdssSb_l5-tQmDRtWGCrXhDSs4_Zv1CDTkZrpK_s.jar
    Aug 20, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-nSAt7hFqXyS699unp0dVHFI2cZDMehWJdTnbpn0KF_A.jar
    Aug 20, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-C7UKSe1wEfGGxCCCJC_C4EYxOrcdS3InltUGzlVJwt4.jar
    Aug 20, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-jzwCq5i-tTcaYhHUww2cGwxyLyX8uj6rLRQB72leExw.jar
    Aug 20, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-grg356BKMmq5k5SGYtNncwpjisnw5nqz6yBVphu-rFU.jar
    Aug 20, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-YOPaSMVujI2iC6R8gFjJ8nGfy8YMSOvxdSw4OASpgRU.jar
    Aug 20, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-I9NuYOlhh6zKVbzm5JzaQtsmZhEJxUfI3aLYyTBbEBM.jar
    Aug 20, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-CEhmk1HYd_BFFk2uq4o2hqJ3Q5mGMDKMVyomrwICUvE.jar
    Aug 20, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-STXvNjcMwUNX3G9SHF0xOvgO8Po4Q8cGA-0RJBEabFo.jar
    Aug 20, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-J5k1G_-rwKAvpd7MiTStS5gWuZlqWE500ipeT7dCd-Q.jar
    Aug 20, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-QIQay9hSjon1Zzzt6EIfIBihRxmoZieottq3WaCpZtE.jar
    Aug 20, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-USur1pFTD4d9HlkmyHjC-IqHLFKL0uFw2z6Z2erqFD8.jar
    Aug 20, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-qA9eE76tomXFvNteG3KRwIKf9SSOR504_W3IxYE7uXg.jar
    Aug 20, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-NT-yniuPRxyYpoWVnFKDy_hUPGDA8SriMyNos4XBR88.jar
    Aug 20, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-L_2ssk6fZgg6AxEx0GP6c4PkeueS_zb5BTZ86e18VJ0.jar
    Aug 20, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-P2wToHwZKZ2-FoCTL-bgk-bRHFHM3VxT5R8_PuVQyKU.jar
    Aug 20, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-_m-xSm2yYBNKSbXC-t9BhExIw7LTuupTUWcneSDlCXk.jar
    Aug 20, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-q6v85PLsGfNUKAJqiXnh2yATk1TbOR0ETvq-MnuyXVc.jar
    Aug 20, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-xYn2sJ1grt60XpMjYkQAQA1CAUOYSdG_q5zcwKqWlsM.jar
    Aug 20, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-dsKdvzpwTKjeXmNQXv0FEvWtj_oqnCgJ3nI38YXztQU.jar
    Aug 20, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-9jS-bMecGsPTiwkVFpecFpyyB0VfZT5tf_Z4X-Bv_u4.jar
    Aug 20, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-oVTGoIzIMhqiKZU2ySauF-lSCXqZ92mPpeofR3tlTV8.jar
    Aug 20, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-pJjbKkRaIreKqWKO03Atd3vGIgx-t8fQZyzYZvxUHEg.jar
    Aug 20, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/com.google.apis/google-api-services-healthcare/v1beta1-rev20200713-1.30.10/e91837f3ef70393435477f24da1d8037079de665/google-api-services-healthcare-v1beta1-rev20200713-1.30.10.jar to gs://temp-storage-for-perf-tests/loadtests/staging/google-api-services-healthcare-v1beta1-rev20200713-1.30.10-JURWP1dWIZvDBGhzx49rM9wlx_3gzI-7LaDigoy5pnc.jar
    Aug 20, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/com.google.apis/google-api-services-bigquery/v2-rev20200719-1.30.10/ef24b5e53a2e3133e8f3a48a5c7fb97c9a20efc9/google-api-services-bigquery-v2-rev20200719-1.30.10.jar to gs://temp-storage-for-perf-tests/loadtests/staging/google-api-services-bigquery-v2-rev20200719-1.30.10-UO0rHodDXrS1DVt-XgTxulehStfKZiU0mkIbK9uOQ34.jar
    Aug 20, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-pRphU9STc1m-sUagrEKPhtRklUBiBaD894bJ0M2v0SE.jar
    Aug 20, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/com.google.apis/google-api-services-clouddebugger/v2-rev20200501-1.30.10/b33ff2dddd08848ea3f267bbddd636554eab43d8/google-api-services-clouddebugger-v2-rev20200501-1.30.10.jar to gs://temp-storage-for-perf-tests/loadtests/staging/google-api-services-clouddebugger-v2-rev20200501-1.30.10-bU7wrCFf0eFwDFdYM5XXmUMgLpTGll41LtaPtLf_5Yc.jar
    Aug 20, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/com.google.api-client/google-api-client-java6/1.30.10/bab94eb59b41c2a0087ce53588ff33d7af15c13b/google-api-client-java6-1.30.10.jar to gs://temp-storage-for-perf-tests/loadtests/staging/google-api-client-java6-1.30.10-m2I4oHqNXjz0pJ-TP3zK5_cofaMLXRJwTHexGAhXg04.jar
    Aug 20, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/com.google.apis/google-api-services-storage/v1-rev20200611-1.30.10/189adae8ea043984d3814717203b470ab0517373/google-api-services-storage-v1-rev20200611-1.30.10.jar to gs://temp-storage-for-perf-tests/loadtests/staging/google-api-services-storage-v1-rev20200611-1.30.10-INrcnpWLWvkN-Mf-ig-ab-lYg9NBuztZDHchmNhVCVc.jar
    Aug 20, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-9QEEC7iYYk7zP7-hOsXgs2Kehcq7D0Lvwvzbq5iHbOY.jar
    Aug 20, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/com.google.apis/google-api-services-dataflow/v1b3-rev20200713-1.30.10/9c7cdb809d0594a4c3347c299c85d89e85617769/google-api-services-dataflow-v1b3-rev20200713-1.30.10.jar to gs://temp-storage-for-perf-tests/loadtests/staging/google-api-services-dataflow-v1b3-rev20200713-1.30.10-7G9SCylELI7xiLyl0DMKUipZJEhBA78X2SidesnpR-E.jar
    Aug 20, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/com.google.api-client/google-api-client-jackson2/1.30.10/9e2d0aa4eaf242bd76668afab8da3a8a8f18cf11/google-api-client-jackson2-1.30.10.jar to gs://temp-storage-for-perf-tests/loadtests/staging/google-api-client-jackson2-1.30.10-VpK9R1T0BTfZBKYo0BYUAFSlh5Bc_I_Q_fdy5_Z9rTc.jar
    Aug 20, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-TKLcpI4nF18BLOydmOl4Qb3C8tk7nmL6sAwgH7_E6xo.jar
    Aug 20, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-MAFLKokIB9dNmD_LOoNwYlMAkiW2JB7q3ffhK56jl9Q.jar
    Aug 20, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-RqBrqXgeyq3KOJCBVPdxvj5Qvf6RadZEIHG1uyJyqsA.jar
    Aug 20, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 178 files cached, 37 files newly uploaded in 1 seconds
    Aug 20, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 20, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 20, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 20, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 20, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 20, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93834 bytes, hash 064087aaad7040fe55e5c7e4dad4e4ac3e1c5313b042327def1a179d76d692a0> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-BkCHqq1wQP5V5cfk2tTkrD4cUxOwQjJ97xoXnXbWkqA.pb
    Aug 20, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Aug 20, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-19_23_45_16-16938772797866909794?project=apache-beam-testing
    Aug 20, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-19_23_45_16-16938772797866909794
    Aug 20, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-19_23_45_16-16938772797866909794
    Aug 20, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-20T06:45:16.683Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 20, 2020 6:45:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    SEVERE: 2020-08-20T06:45:21.881Z: Staged package google-api-services-cloudresourcemanager-v1-rev20200720-1.30.10-PQqVlsOIu2M59XLbEl59CW6wNmY5OYy_R71khhZQ8-Y.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/google-api-services-cloudresourcemanager-v1-rev20200720-1.30.10-PQqVlsOIu2M59XLbEl59CW6wNmY5OYy_R71khhZQ8-Y.jar' is inaccessible.
    Aug 20, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    SEVERE: 2020-08-20T06:45:24.082Z: Workflow failed. Causes: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
    Aug 20, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-20T06:45:24.113Z: Cleaning up.
    Aug 20, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-20T06:45:24.180Z: Worker pool stopped.
    Aug 20, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-20T06:45:28.006Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 20, 2020 6:45:32 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-19_23_45_16-16938772797866909794 failed with status FAILED.
    Aug 20, 2020 6:45:32 AM org.apache.beam.sdk.testutils.metrics.MetricsReader getCounterMetric
    SEVERE: Failed to get metric fields_read, from namespace org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT
    Aug 20, 2020 6:45:32 AM org.apache.beam.sdk.testutils.metrics.MetricsReader getCounterMetric
    SEVERE: Failed to get metric fields_read, from namespace org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT
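
Note: the job above terminated as FAILED before the monitoring DoFns ever ran, so the fields_read counter was never committed and the reader falls back to a sentinel of -1 (visible in the results below). As background only, a minimal sketch of guarding a counter read on the job's terminal state; this is not the test harness's actual MetricsReader, and the helper name is illustrative:

    import org.apache.beam.sdk.PipelineResult;
    import org.apache.beam.sdk.metrics.MetricNameFilter;
    import org.apache.beam.sdk.metrics.MetricQueryResults;
    import org.apache.beam.sdk.metrics.MetricResult;
    import org.apache.beam.sdk.metrics.MetricsFilter;

    public class CounterReadSketch {
      /** Returns the committed counter value, or -1 if the job failed or never reported it. */
      public static long readCounterOrSentinel(PipelineResult result, String namespace, String name) {
        if (result.waitUntilFinish() != PipelineResult.State.DONE) {
          return -1L;  // a FAILED/CANCELLED job never commits the counter, as in the run above
        }
        MetricQueryResults metrics = result.metrics().queryMetrics(
            MetricsFilter.builder()
                .addNameFilter(MetricNameFilter.named(namespace, name))
                .build());
        for (MetricResult<Long> counter : metrics.getCounters()) {
          return counter.getCommitted();
        }
        return -1L;  // counter not found in the query results
      }
    }

Reading the state first avoids attributing a missing counter to the pipeline logic when the job simply never started its workers.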

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 92bd21cd-2799-499d-99bb-0057fecc79d9 and timestamp: 2020-08-20T06:45:32.410000000Z:
                     Metric:                    Value:
                 fields_read                      -1.0
                   read_time                       0.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 20, 2020 6:45:32 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.018 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker,5,main]) completed. Took 29.202 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1m 15s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/7okw5cb2wvoqe

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #892

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/892/display/redirect?page=changes>

Changes:

[chamikaramj] Include coders embedded in CombinePayload when converting x-lang

[kevinsijo] [BEAM-9919] Refining xlang types and wrappers

[Luke Cwik] [BEAM-9979] Support a reset call that happens after the bundle is done

[noreply] [BEAM-10200] Add optional experiment to enable heap dump through the …

[kevinsijo] [BEAM-9919] xlang transforms wrapped with namespaces

[kevinsijo] [BEAM-9919] cleaning namespace.go

[noreply] [BEAM-9547] Implement dataframes top, join, merge. (#12516)


------------------------------------------
[...truncated 292.64 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 20, 2020 12:45:19 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 20, 2020 12:45:19 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 20, 2020 12:45:19 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 20, 2020 12:45:19 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 20, 2020 12:45:19 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 20, 2020 12:45:19 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 20, 2020 12:45:19 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
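
Note: this is the usual missing-Row-schema failure: a PCollection<Row> must have its schema (or a RowCoder) attached explicitly before the pipeline is finalized, exactly as the message suggests via setRowSchema/setCoder. A minimal, self-contained sketch of that pattern; the schema, values, and transforms are illustrative and not the test's actual code:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.Filter;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Schema for the four projected columns in the query above (values below are made up).
        Schema schema = Schema.builder()
            .addStringField("author")
            .addStringField("type")
            .addStringField("title")
            .addInt64Field("score")
            .build();

        PCollection<Row> rows = p.apply(
            Create.of(Row.withSchema(schema).addValues("alice", "story", "hello", 3L).build())
                .withCoder(RowCoder.of(schema)));

        // A downstream PCollection<Row> does not pick up the schema automatically;
        // attaching it explicitly is what the error message above asks for.
        rows.apply(Filter.by((Row r) -> r.getInt64("score") > 2))
            .setRowSchema(schema);
        // (Equivalently: .setCoder(RowCoder.of(schema)))

        p.run().waitUntilFinish();
      }
    }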

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 20, 2020 12:45:19 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 20, 2020 12:45:19 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 20, 2020 12:45:19 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 20, 2020 12:45:19 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 20, 2020 12:45:19 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 20, 2020 12:45:19 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 20, 2020 12:45:20 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 20, 2020 12:45:20 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
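
Note: the plan above pushes both the projection (by, type, title, score) and the filter into the BigQuery Storage Read API. For comparison, a hedged sketch of the equivalent hand-written read outside Beam SQL, assuming BigQueryIO's withSelectedFields/withRowRestriction options on the DIRECT_READ path; the table name is illustrative, not the table used by this test:

    import java.util.Arrays;
    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.values.PCollection;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Same projection and filter that the SQL planner pushed down above.
        PCollection<TableRow> rows = p.apply(
            BigQueryIO.readTableRows()
                .from("bigquery-public-data:hacker_news.full")  // illustrative table
                .withMethod(Method.DIRECT_READ)
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }

With DIRECT_READ only the selected columns and matching rows leave BigQuery, which is what the fields_read counter later in this log is meant to capture.
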
    Aug 20, 2020 12:45:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 20, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 20, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-IBKEL3Ss2VLQdSe7aWRQz4zO3tEb7TZTz0rfWd0XLDI.jar
    Aug 20, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-C4xqxQYPoySxxhB2TXx8HD-4muvmX-epOpRnxcDpOhE.jar
    Aug 20, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-xiMrX90_G-l-QPQoYVpO__iWS4qgiRwsrG88NsC2TJM.jar
    Aug 20, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-vl7_okSIT05Gf10Id9qDbllWXewkHjdGjY7HS-KGfnI.jar
    Aug 20, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-JM-p5QTTwgnQW3ixKyfCMjkaxscEXZAbg964epsHnAA.jar
    Aug 20, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-aiijV62NAJMLUlBt2cw7ZHHDy246byozlyqvXsEXo-s.jar
    Aug 20, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-NjIA6TSt5ZbnxbuujGgfVd2hpM_DXsUobTLrIcsHUZY.jar
    Aug 20, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-o8mvSwiHJy1LxSGktWyFrMlIzqz4-glEoAFQr9inlQo.jar
    Aug 20, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-Zb-fLdMF9W1kGWO4WwItRlWf_vDYIYq8llhR8r6Q0ZU.jar
    Aug 20, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-2jwRhZjukLPwAciaUo2sdnBCt9Vmif_z0I_vd5IN6kk.jar
    Aug 20, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-1pMF6XDKcb1WNSTJATp0914BhqEXB0kPiFRffe1nbkY.jar
    Aug 20, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-GVu1HnBwqJYIKxjfHrZnMrhbGyC-qlmTWJntCnQrVjo.jar
    Aug 20, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-CNL8x4RX8Td4sLdngw8EMwZgpDiuDm2AVEMvUqGtMRY.jar
    Aug 20, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-IBKEL3Ss2VLQdSe7aWRQz4zO3tEb7TZTz0rfWd0XLDI.jar
    Aug 20, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-h1lG0rnCAmMhZVrjsu1lj6iLInnJithW2iw7DohGK3c.jar
    Aug 20, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-QkYHS5Bv5Snn8mqhBMWDqgLtfedAuiVCWQiu5N2FTf4.jar
    Aug 20, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-F-WG0ay-2mjenRT5O8qgJMbTtEy_AE4V1K7lnS9wIzM.jar
    Aug 20, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-1q1dHVFT8lC86-dnM6wrPkN_XG2Ncp6Gj7BfSEg3XEc.jar
    Aug 20, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test220898642873210943.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-k2Y-VKSG9JbeQBa3abcGUGlTu7rAHdyuM5uKYrabrvI.jar
    Aug 20, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-KQlSVNOOpp2oY0MJkGam4tOPjUdYW0zdmFkjWk1pBMw.jar
    Aug 20, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-F9jKieWVy4vETkdbKlOYT0RC-GEpBwBrlOm2CrPodOw.jar
    Aug 20, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-VHyAb3uaRo_SRneVN1nI4vEQQrdMsVzqBVE6pE38nxg.jar
    Aug 20, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-FmF2tyWBN0fMOqnp5ycS3ZGfkQlNvu_n34Q3B59jr-4.jar
    Aug 20, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-Yihd_07pcdBbGNiPw0x2HuojZj1zdrZVTJgYq_xr4rI.jar
    Aug 20, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-qQN-qtn3EOmknu2KrB67g1dDiIpgny0RuKtf1gD8ZCw.jar
    Aug 20, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-1b96AlvNWmpxCgr4tlQnGMMVlH9Jn6VG_2jg7qM5B7o.jar
    Aug 20, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-cJXHpNgkrLJ84_KmRi3dx3TEu4dPqfAZA-cBChpyvFg.jar
    Aug 20, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-o8A_vPS5ukzIdV8aibMLs4_7hWR1xQzoFbMLC_fyKzk.jar
    Aug 20, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-4376nIrlK8Ev4aLkKsbHfhBNJT9FOtLoMKwGGTgTeZI.jar
    Aug 20, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-AP1TdVDj_pqvGAj3LfS17If4J0fSISK02aMfYEo--oE.jar
    Aug 20, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-_Xjuu3PZD3BOtHESn0JZZ2s9T2LkH7GO-bSIsSIdtkY.jar
    Aug 20, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 1 seconds
    Aug 20, 2020 12:45:24 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 20, 2020 12:45:24 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 20, 2020 12:45:24 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 20, 2020 12:45:24 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 20, 2020 12:45:24 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 20, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93833 bytes, hash e2764e69e4d30d417a1c0b4467f0849f4dbfe4d9dc30c7deb6dd80760ee989f2> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-4nZOaeTTDUF6HAtEZ_CEn02_5NncMMfett2Adg7pifI.pb
    Aug 20, 2020 12:45:24 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Aug 20, 2020 12:45:25 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-19_17_45_24-4768161924024860342?project=apache-beam-testing
    Aug 20, 2020 12:45:25 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-19_17_45_24-4768161924024860342
    Aug 20, 2020 12:45:25 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-19_17_45_24-4768161924024860342
    Aug 20, 2020 12:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-20T00:45:24.733Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 20, 2020 12:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-20T00:45:32.771Z: Worker configuration: n1-standard-1 in us-central1-f.
    Aug 20, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-20T00:45:33.662Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 20, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-20T00:45:33.768Z: Expanding GroupByKey operations into optimizable parts.
    Aug 20, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-20T00:45:33.798Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 20, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-20T00:45:33.866Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 20, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-20T00:45:33.894Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 20, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-20T00:45:33.920Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 20, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-20T00:45:33.952Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 20, 2020 12:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-20T00:45:34.343Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 20, 2020 12:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-20T00:45:34.426Z: Starting 5 workers in us-central1-f...
    Aug 20, 2020 12:46:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-20T00:46:05.843Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 20, 2020 12:46:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-20T00:46:10.238Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 20, 2020 12:46:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-20T00:46:33.768Z: Workers have started successfully.
    Aug 20, 2020 12:46:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-20T00:46:33.831Z: Workers have started successfully.
    Aug 20, 2020 12:47:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-20T00:47:08.836Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 20, 2020 12:47:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-20T00:47:08.989Z: Cleaning up.
    Aug 20, 2020 12:47:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-20T00:47:09.065Z: Stopping worker pool...
    Aug 20, 2020 12:48:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-20T00:47:59.401Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 20, 2020 12:48:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-20T00:47:59.436Z: Worker pool stopped.
    Aug 20, 2020 12:48:07 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-19_17_45_24-4768161924024860342 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 2d002967-709c-4eb7-9204-d8eb78d856ac and timestamp: 2020-08-20T00:48:07.237000000Z:
                     Metric:                    Value:
                   read_time                    13.659
                 fields_read                 4375276.0
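
Note: read_time and fields_read are custom counters published by the monitoring DoFns in the pipeline above. As background, a minimal sketch of how such a counter is typically recorded with Beam's Metrics API; the namespace and name mirror the log, but the actual RowMonitor/TimeMonitor implementations may differ:

    import org.apache.beam.sdk.metrics.Counter;
    import org.apache.beam.sdk.metrics.Metrics;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.values.Row;

    // Counts how many fields pass through, in the spirit of the fields_read value above.
    public class FieldCountingFn extends DoFn<Row, Row> {
      private final Counter fieldsRead = Metrics.counter(
          "org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT",
          "fields_read");

      @ProcessElement
      public void processElement(ProcessContext c) {
        Row row = c.element();
        fieldsRead.inc(row.getFieldCount());  // one increment per field in the row
        c.output(row);
      }
    }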

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 20, 2020 12:48:07 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.036 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker,5,main]) completed. Took 2 mins 57.058 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 50s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/64nrz3uzwk75u

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #891

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/891/display/redirect?page=changes>

Changes:

[zyichi] Allow Nexmark launcher to publish human-readable events to pubsub.

[irvi.fa] [BEAM-10753] Add Slack link invitation on README

[Luke Cwik] [BEAM-10756] Fix empty pull response to not ack and to not throw

[noreply] Merge pull request #12597: [BEAM-10685] Added integration test for

[ettarapp] clarifying unclear comments

[noreply] Update README.md

[noreply] [BEAM-3301] Adding SDF Go Dataflow translation. (#12629)

[noreply] [BEAM-10752] Use TestPubsubSignal in PubsubToBigqueryIT (#12625)


------------------------------------------
[...truncated 296.90 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 19, 2020 6:49:43 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 19, 2020 6:49:43 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 19, 2020 6:49:43 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 19, 2020 6:49:43 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 19, 2020 6:49:43 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 19, 2020 6:49:43 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 19, 2020 6:49:44 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 19, 2020 6:49:45 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 19, 2020 6:49:45 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 19, 2020 6:49:45 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 19, 2020 6:49:45 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 19, 2020 6:49:45 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 19, 2020 6:49:45 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 19, 2020 6:49:46 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 19, 2020 6:49:46 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 19, 2020 6:49:48 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 19, 2020 6:49:54 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 19, 2020 6:49:54 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-TZhYN1bIzxcgSe5fod9sMnPRx0ctrGfFWxUK-GQhrxg.jar
    Aug 19, 2020 6:49:56 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-tSB5-FmR6y70GrpPaHvQdHeRG4S69aUuELLcJsM1PqA.jar
    Aug 19, 2020 6:49:56 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-XnpfFsyHfXNZhtgwK9z3Y63MxJtdmpcx2bQSj9lI84w.jar
    Aug 19, 2020 6:49:56 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-xumpw-FffCdgEN5dZhmLdaTyaV-aBtoy1sXfv7RUpJY.jar
    Aug 19, 2020 6:49:56 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-15mnQGLJtkeZRr9EAn2EUJS9vlZynCp9xSzLOueDZ7k.jar
    Aug 19, 2020 6:49:56 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-TZhYN1bIzxcgSe5fod9sMnPRx0ctrGfFWxUK-GQhrxg.jar
    Aug 19, 2020 6:49:56 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-Ll7T9Cgw-yOdksCl1hpkHZsO_NHwM25HTJifY7erZAU.jar
    Aug 19, 2020 6:49:56 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-Fn8hnsL2LQnQkeBr-i0U5snf6JgHcUOZ37SQCP32Ie4.jar
    Aug 19, 2020 6:49:56 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-kDb2RGyjj4YMGyIKCxjRcNjYzpDCN3RvoSZ9XmdUL64.jar
    Aug 19, 2020 6:49:56 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-v0f7Impd3KgIPIeaPRQ0DJtvVAXuoDoGKb8ZHRhq1lE.jar
    Aug 19, 2020 6:49:56 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-VD9n63S66HLjjJFkZ_hm8pYfTCvq6xV-bZ7b4nYqIgU.jar
    Aug 19, 2020 6:49:56 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-27ffg7Pvyo30LhloFzwt6jhl27Ybxrqlwz0Q9ht-Mz0.jar
    Aug 19, 2020 6:49:56 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-eMj214S8SoZ23_qUuTO3CtjzTfhmP6TJNqvmGF8aJ3k.jar
    Aug 19, 2020 6:49:56 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-f8LLuFDbx3RWB5CUxhIEqePB-nsM-MdWM0OeQ8ly0J8.jar
    Aug 19, 2020 6:49:56 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-vxpZA9MWtXZg-T7pnAcLFFng8fshXMxYEBeR-gpSqIU.jar
    Aug 19, 2020 6:49:56 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-pX-KFwKq2Sg31DnezWGuhaL-qn29-N_TT4n8lBLbdZ0.jar
    Aug 19, 2020 6:49:56 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-AED8hoQdHfisbvA9CAJ15gAHiED9pXFiWoTQ6DDVjfM.jar
    Aug 19, 2020 6:49:56 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7023854162250129648.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-MZIVcT7q4MnBmJVW5pBoCTnBPsZZnGZFCO6U3MhQPqY.jar
    Aug 19, 2020 6:49:56 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-ANodPW_k7vZcZMnj4z-jA6WZVPx0zwGTlaAPzbgHTFQ.jar
    Aug 19, 2020 6:49:56 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-qULdbL1Wy8rdrVYr41ealkha8Mdan7alNU1F-b1bMP8.jar
    Aug 19, 2020 6:49:56 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-5uPTZFALJdc5dLhMjGiQDLD770Va4Ferl_2NJqmJNQo.jar
    Aug 19, 2020 6:49:56 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-3viS4EzRsLIY7z7iw-ao6cuy99nCqy3MUBW9dEkO088.jar
    Aug 19, 2020 6:49:56 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-bN_v9WodfEd-D2ktuYasHhi1q1XN9BUVH40n-y-owIA.jar
    Aug 19, 2020 6:49:56 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-1hrON5pQuvqrXoiXH5mD1OAF9_fTaewk3wLgye0YFLE.jar
    Aug 19, 2020 6:49:56 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-xOtuf0a_HBwTR-bnq5_8yFFHiB-MWayakz-g40hhObM.jar
    Aug 19, 2020 6:49:56 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-ggqQVvGXPYnlvIhGzPI_4J8tPo0KCa4Tfp0oXjVq7os.jar
    Aug 19, 2020 6:49:56 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-Wp-3GFC_MrJ4muZoc6CyNJMwR_9WAYsFkMgw8xrY2m4.jar
    Aug 19, 2020 6:49:57 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-b1w45AL6lyeO7f472dVcsVuDyaaqW2m5CSf1UZtWjMI.jar
    Aug 19, 2020 6:50:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-Ow7sgoKoaFa_zw_nP4I2xx6Z5Tv6fSb0druF0G2Wcbo.jar
    Aug 19, 2020 6:50:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-f-6A9fl2dZambdKFCsc9php67WoJ5qckrqP0GCimx0M.jar
    Aug 19, 2020 6:50:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-KH05eTF0urWQjxuPHZiZQOWvFf6W89bd5qKLsJWNIg0.jar
    Aug 19, 2020 6:50:02 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 7 seconds
    Aug 19, 2020 6:50:02 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 19, 2020 6:50:03 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 19, 2020 6:50:03 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 19, 2020 6:50:03 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 19, 2020 6:50:03 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 19, 2020 6:50:03 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93835 bytes, hash 73652b5c23f9d1e51b839fa17199a7f255b2a5e75651e1c20cc00906bf863f50> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-c2UrXCP50eUbg5-hcZmn8lWypedWUeHCDMAJBr-GP1A.pb
    Aug 19, 2020 6:50:03 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Aug 19, 2020 6:50:05 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-19_11_50_04-9068436304700300052?project=apache-beam-testing
    Aug 19, 2020 6:50:05 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-19_11_50_04-9068436304700300052
    Aug 19, 2020 6:50:05 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-19_11_50_04-9068436304700300052
    Aug 19, 2020 6:50:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-19T18:50:04.139Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 19, 2020 6:50:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-19T18:50:13.007Z: Worker configuration: n1-standard-1 in us-central1-b.
    Aug 19, 2020 6:50:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-19T18:50:13.851Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 19, 2020 6:50:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-19T18:50:13.897Z: Expanding GroupByKey operations into optimizable parts.
    Aug 19, 2020 6:50:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-19T18:50:13.926Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 19, 2020 6:50:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-19T18:50:13.996Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 19, 2020 6:50:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-19T18:50:14.023Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 19, 2020 6:50:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-19T18:50:14.053Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 19, 2020 6:50:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-19T18:50:14.081Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 19, 2020 6:50:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-19T18:50:14.698Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 19, 2020 6:50:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-19T18:50:14.785Z: Starting 5 workers in us-central1-b...
    Aug 19, 2020 6:50:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-19T18:50:21.387Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 19, 2020 6:50:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-19T18:50:44.063Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 19, 2020 6:51:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-19T18:51:10.470Z: Workers have started successfully.
    Aug 19, 2020 6:51:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-19T18:51:10.496Z: Workers have started successfully.
    Aug 19, 2020 6:51:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-19T18:51:48.524Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 19, 2020 6:51:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-19T18:51:48.647Z: Cleaning up.
    Aug 19, 2020 6:51:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-19T18:51:48.722Z: Stopping worker pool...
    Aug 19, 2020 6:52:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-19T18:52:32.938Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 19, 2020 6:52:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-19T18:52:32.976Z: Worker pool stopped.
    Aug 19, 2020 6:52:40 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-19_11_50_04-9068436304700300052 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): dfc5b34e-c62c-4179-a03f-485818820bfc and timestamp: 2020-08-19T18:52:40.993000000Z:
                     Metric:                    Value:
                   read_time                    17.831
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 19, 2020 6:52:41 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.037 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 3 mins 38.173 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 54s
106 actionable tasks: 72 executed, 34 from cache

Publishing build scan...
https://gradle.com/s/p6kq3gsgmmryk

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #890

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/890/display/redirect?page=changes>

Changes:

[tobiasz.kedzierski] [BEAM-10667] Make GitHub Actions building wheels bucket name stored


------------------------------------------
[...truncated 293.61 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 19, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 19, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 19, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 19, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 19, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 19, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 19, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])
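
For reference, the Calcite plan above is what Beam SQL produces for the logged query. Below is a minimal, self-contained sketch of running the same query shape through SqlTransform over a small in-memory input; the class name, the four-field schema, and the sample row are illustrative placeholders only, not the integration test's actual code (which reads the real HACKER_NEWS BigQuery table).

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class HackerNewsQuerySketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Simplified stand-in for the HACKER_NEWS schema (only the referenced columns).
        Schema schema = Schema.builder()
            .addStringField("by")
            .addStringField("type")
            .addStringField("title")
            .addInt32Field("score")
            .build();

        PCollection<Row> hackerNews = p.apply(
            Create.of(Row.withSchema(schema).addValues("alice", "story", "A title", 5).build())
                .withRowSchema(schema));

        // A single schema-aware input is visible to the query as PCOLLECTION.
        PCollection<Row> result = hackerNews.apply(
            SqlTransform.query(
                "SELECT `by` AS author, `type`, `title`, `score` FROM PCOLLECTION "
                    + "WHERE (`type` = 'story' OR `type` = 'job') AND `score` > 2"));

        p.run().waitUntilFinish();
      }
    }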


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
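
The failure above is the coder-inference problem the message describes: the ParDo emits Beam Rows but no schema is attached to its output, so no Coder can be chosen. Below is a minimal sketch of the remedy the message itself suggests (attaching a schema with setRowSchema); the DoFn, schema, and class name are illustrative placeholders, not the code of BigQueryIOPushDownIT.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class SetRowSchemaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Illustrative schema; the real table has more fields.
        final Schema schema = Schema.builder()
            .addStringField("author")
            .addInt32Field("score")
            .build();

        PCollection<Row> rows = p
            .apply(Create.of("seed"))
            .apply("MakeRows", ParDo.of(new DoFn<String, Row>() {
              @ProcessElement
              public void processElement(ProcessContext c) {
                c.output(Row.withSchema(schema).addValues("alice", 3).build());
              }
            }))
            // Without this (or an explicit setCoder), coder inference for Row elements
            // fails with the IllegalStateException shown above.
            .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }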

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 19, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 19, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 19, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 19, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 19, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 19, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 19, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 19, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
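
At the BigQueryIO level, the push-down logged above corresponds roughly to a DIRECT_READ that projects only the used fields and applies the filter as a row restriction, so unneeded columns and rows never leave BigQuery. The sketch below illustrates that shape; the table spec and class name are placeholders, and this is not the integration test's actual code.

    import java.util.Arrays;

    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.values.PCollection;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Placeholder table spec; the IT resolves the real table through its table provider.
        PCollection<TableRow> rows = p.apply(
            BigQueryIO.readTableRows()
                .from("my-project:my_dataset.HACKER_NEWS")
                .withMethod(TypedRead.Method.DIRECT_READ)
                // Read only the fields the query uses (usedFields in the plan above)...
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                // ...and evaluate the pushed-down filter inside the BigQuery storage read.
                .withRowRestriction("(`type` = 'story' OR `type` = 'job') AND `score` > 2"));

        p.run().waitUntilFinish();
      }
    }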
    Aug 19, 2020 12:45:14 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 19, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 19, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-OUb14jU6PTGp7jAIl53y2ObV0ZLYfaj8bLYplhitvcM.jar
    Aug 19, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-QWgkV1Q5fw4wU9PkFJvia9eJhrodtk8AiO7VbsPD1xA.jar
    Aug 19, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-Zuy3wITbWk0AjIYExFmFTtsihh5g5KnHvmYZgAXUdsI.jar
    Aug 19, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-7mgzegqhsK8GB9VXVTpG7XzonJK0-gMZ-FITSmhTjvo.jar
    Aug 19, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-wzrMLyp8HbI7T1D-6UE6oldBX2u9kHuPQpm_xw33YAE.jar
    Aug 19, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-MXkpBOw5Bnf8ZtmMhYSZmLltFUL36VMb6UAzd2SoZ04.jar
    Aug 19, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-bD13VsfI8wwKTu6L_ZfENX28WE1nvXvcCY12EAxgI0o.jar
    Aug 19, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-eltukcUUadMSb6NOO7_uUVUiwm1DH6GMP725g-vYZ1o.jar
    Aug 19, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-wQeVxQxnIyJ0V4x48BAsQGSWNn6Ucu4a2ulZYKEX7Go.jar
    Aug 19, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-Zuy3wITbWk0AjIYExFmFTtsihh5g5KnHvmYZgAXUdsI.jar
    Aug 19, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-m95ep1-cfoJTxXAKVP1Uu9_5QANXLK9Kdbt15ldnRpw.jar
    Aug 19, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-o3yKxVcJexwr3vM3RtGoEwJgJ6yMmtmh6IZy90I-v5A.jar
    Aug 19, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-Ca9Iy8725PC4FWhMUjkmX3t8kQpAWoJY_l-GRIo6nfo.jar
    Aug 19, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-xYb3SBsuz2Ic3gs1h-yfBC46erjlARv2a7yl-uVRCzw.jar
    Aug 19, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-P1VbPB54vx5f43jhp_gwOYhtzJfUvEu3CaMmKK_bplQ.jar
    Aug 19, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5370658620663938291.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-0fxEmuiS7a3R3bgzglISQCmNB02UA_vdCUrfglaX-2s.jar
    Aug 19, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-os2EteHLHVPg7X6tgbxK7xB0Y3cpbG63zOQnUcEpYsI.jar
    Aug 19, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-dCxY6NQYnFVkF5q_6vg_pskm8ehrxO-dSJU1OSkFpeQ.jar
    Aug 19, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-GFozBfTE_4wYlr9vPdZhD3xBiojgHlDah1N0ebb3CMs.jar
    Aug 19, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-j0CwtdpmW94lYwm-k7vfmwmASYR2VR_C15ZjobQdEWc.jar
    Aug 19, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-oX0O5AzHGy03rKbrpd1WcpLOQfK_5GMKgxcSItjiSdA.jar
    Aug 19, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-Gzg1tW52fB4XDLgYNzq5Lt0pin2RF86f3Qv9HH0juu4.jar
    Aug 19, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-Ipkbqpz5Y0Veg3IFs1K3B14KoO9tj5XJNMeRdURdfeU.jar
    Aug 19, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-Rd0xCeWYBhuFGCw5E0iflWit1X69s3r2lo9casQENLk.jar
    Aug 19, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-Y8lUaSQRCpv2ApoVh0LceiXMaZvfhKAiiOo5PIzkh0M.jar
    Aug 19, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-36_G1-cS5QW45yfRao0lYO7ClSwHE0wLc00gVsnUYPg.jar
    Aug 19, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-nrcMbopIug1IcrkGreQMyP_H9WzMDWCrRD9jSuxEqVg.jar
    Aug 19, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-heM0FLjl3i5AdOdvPnIoRM2IRqWnI-bCQCnSLVb96e0.jar
    Aug 19, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-aY57c023pwYGE4rQBH2rskNyhiQT6SppfhwG5sArmo0.jar
    Aug 19, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-TbBXwSdaFMEGCMYwIap-X2xXWVyPtQlpu3CC2Irsdvo.jar
    Aug 19, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-jgn01QR3S65b4tFzAkZJICu1d-bXGC0Bj90dzAJ0kEY.jar
    Aug 19, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 1 seconds
    Aug 19, 2020 12:45:17 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 19, 2020 12:45:17 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 19, 2020 12:45:17 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 19, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 19, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 19, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93834 bytes, hash c69ac2e4f363f880042bb235cf5c877dd6869c49bc57987219dbdad8654e1483> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-xprC5PNj-IAEK7I1z1yHfdaGnEm8V5hyGdva2GVOFIM.pb
    Aug 19, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Aug 19, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-19_05_45_18-10417889275630777521?project=apache-beam-testing
    Aug 19, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-19_05_45_18-10417889275630777521
    Aug 19, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-19_05_45_18-10417889275630777521
    Aug 19, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-19T12:45:18.361Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 19, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-19T12:45:26.145Z: Worker configuration: n1-standard-1 in us-central1-f.
    Aug 19, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-19T12:45:26.848Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 19, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-19T12:45:26.893Z: Expanding GroupByKey operations into optimizable parts.
    Aug 19, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-19T12:45:26.923Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 19, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-19T12:45:26.999Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 19, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-19T12:45:27.024Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 19, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-19T12:45:27.072Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 19, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-19T12:45:27.105Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 19, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-19T12:45:27.486Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 19, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-19T12:45:27.568Z: Starting 5 workers in us-central1-f...
    Aug 19, 2020 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-19T12:45:52.127Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 19, 2020 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-19T12:45:55.220Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 19, 2020 12:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-19T12:46:11.097Z: Workers have started successfully.
    Aug 19, 2020 12:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-19T12:46:11.124Z: Workers have started successfully.
    Aug 19, 2020 12:46:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-19T12:46:50.002Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 19, 2020 12:46:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-19T12:46:50.157Z: Cleaning up.
    Aug 19, 2020 12:46:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-19T12:46:50.251Z: Stopping worker pool...
    Aug 19, 2020 12:47:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-19T12:47:40.753Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 19, 2020 12:47:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-19T12:47:40.786Z: Worker pool stopped.
    Aug 19, 2020 12:47:49 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-19_05_45_18-10417889275630777521 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): b2ef45bc-a372-4080-befa-7fb9bf0e734d and timestamp: 2020-08-19T12:47:49.950000000Z:
                     Metric:                    Value:
                   read_time                    17.619
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 19, 2020 12:47:50 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.017 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 2 mins 45.13 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 34s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/nyyw4qq4qxrau

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #889

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/889/display/redirect?page=changes>

Changes:

[ekirpichov] Add enum34 to manual_licenses


------------------------------------------
[...truncated 294.63 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 19, 2020 6:45:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 19, 2020 6:45:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 19, 2020 6:45:35 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 19, 2020 6:45:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 19, 2020 6:45:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 19, 2020 6:45:35 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 19, 2020 6:45:35 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 19, 2020 6:45:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 19, 2020 6:45:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 19, 2020 6:45:35 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 19, 2020 6:45:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 19, 2020 6:45:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 19, 2020 6:45:35 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 19, 2020 6:45:36 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 19, 2020 6:45:36 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 19, 2020 6:45:36 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 19, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 19, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-hkJpni8CDDlxcCAuyzSnJNrtrMvpEUbiBOBP66NmXcI.jar
    Aug 19, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-Lv0lVIu7afWaOK2t6pScat1fL9KmnE1LjkrJXJuCIic.jar
    Aug 19, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-kTEinO9ddcQ6C5qzjhDVC9xNM-KLQyq5zj-_gpW-L0Q.jar
    Aug 19, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-N9-gjSlkEH50trsPumUKgFltxktIE7NH_7rbaJX6KF4.jar
    Aug 19, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-xO3FIqiRvyi419Rx9eV_sLlfyEON3nlb5NAJNsvtDuo.jar
    Aug 19, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-e-32m7Z-MHKthaOXQOczlRgXKgPFVHpNytzIv59hQyo.jar
    Aug 19, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-4I20ItHt04GEquUXrkLWFX8rYqSPCOo8xrL8UIX19IY.jar
    Aug 19, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT--akndtC6Y3AFYV7nzw7mhaauteIQ-uQdQQ0tkzP8zZs.jar
    Aug 19, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-Npdrxu42u8J2S8b4r9k9AoGTJaXRFAIZSAMoaSJuMeY.jar
    Aug 19, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-0SEblGwiz3tb3f_CO3hC0YdSjMinSfdwoFXYdHZkYxQ.jar
    Aug 19, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-4gBMqhu7l2fKydiFGH4gjTA6gGsRewX2vICsLb62lLc.jar
    Aug 19, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-H0LXA_V3QsiY30DQKDXNJW7yVn7spYtm88oqAEp5tYI.jar
    Aug 19, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-Ga3n__pSpXFCUv03kHjsh5p2muIsEf4S6QoZZHRB47w.jar
    Aug 19, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-aRZIOEcLrA-20cazlLN4lT22PpoD7orePqdWbWBEoCg.jar
    Aug 19, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-Scut8dKqVQl1qS_XdND4MY--xGh9OKvoHbjkUSIa0uc.jar
    Aug 19, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-V-cqmYlfjHQ-Q1n86hUIHlvIBdR_Nb0Wc1tVc4uEXY4.jar
    Aug 19, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-OY01q6f8IDv59526CBBO5CnhAj-CNwvXlt5-OO4JPQA.jar
    Aug 19, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-b-zxjXTtPda857johlZf-kAmLIYLjKFeYEGMaMynMqI.jar
    Aug 19, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3318380284740842799.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-alWc3Xn0ab72AFwirbIKiVMWzvyzaFUiR17yHKZFhxU.jar
    Aug 19, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-SPusDPs_OeEFyukVsefGBXhBt3O0bJycfVgzBuTDv3E.jar
    Aug 19, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-hkJpni8CDDlxcCAuyzSnJNrtrMvpEUbiBOBP66NmXcI.jar
    Aug 19, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-xYk0ExWywDNGJuqjk0noz-aAt3AZvYmT_cKeR7jLvuw.jar
    Aug 19, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-Jjul1T7lmpR0yXHMR56p_ac1-uqwOTy7TSyQnsPq0Vs.jar
    Aug 19, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-ciqOwT9upDYYnbGxC6dovmrTcsoiRuOT52QWS9L1lIw.jar
    Aug 19, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-UyT8qKNwouz4p63Hbi3WsrgvmfMriu84CwM8Y_JHAAc.jar
    Aug 19, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-Mv-b86v73E8gEGRy7GtYY8gXJAADWcUjyzHLK9HLX_Y.jar
    Aug 19, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-JCDMaAl8Oo2RVvshaUqHvjYDG6OeXC6SnAHLzDt92e0.jar
    Aug 19, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-hIqJDB4i7yX-Yr8m4en-atfe99WLj27PrWvWZVlHLMA.jar
    Aug 19, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-TayoSFQurFgAst7vf3a1JS4P9jBL0ehZDUy15DdN_Kw.jar
    Aug 19, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-KjYYxZHJHe1kWTwtWBDuSlvzXL7XrRca4fGVsw3tkF8.jar
    Aug 19, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-E2xWtONuukslEj7aGnu5lhpd1Q1ByeYNu-ggCwBUvuM.jar
    Aug 19, 2020 6:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 1 seconds
    Aug 19, 2020 6:45:40 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 19, 2020 6:45:40 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 19, 2020 6:45:40 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 19, 2020 6:45:40 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 19, 2020 6:45:40 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 19, 2020 6:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93834 bytes, hash 56ac4399b423f11d7fc6f8158c3521a37dd0391e4c219cc197e4c6a9f3992324> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-VqxDmbQj8R1_xvgVjDUho33QOR5MIZzBl-TGqfOZIyQ.pb
    Aug 19, 2020 6:45:40 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Aug 19, 2020 6:45:41 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-18_23_45_40-4028865520954326159?project=apache-beam-testing
    Aug 19, 2020 6:45:41 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-18_23_45_40-4028865520954326159
    Aug 19, 2020 6:45:41 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-18_23_45_40-4028865520954326159
    Aug 19, 2020 6:45:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-19T06:45:40.608Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 19, 2020 6:45:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-19T06:45:47.878Z: Worker configuration: n1-standard-1 in us-central1-f.
    Aug 19, 2020 6:45:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-19T06:45:48.515Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 19, 2020 6:45:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-19T06:45:48.556Z: Expanding GroupByKey operations into optimizable parts.
    Aug 19, 2020 6:45:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-19T06:45:48.587Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 19, 2020 6:45:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-19T06:45:48.660Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 19, 2020 6:45:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-19T06:45:48.691Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 19, 2020 6:45:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-19T06:45:48.725Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 19, 2020 6:45:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-19T06:45:48.771Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 19, 2020 6:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-19T06:45:49.168Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 19, 2020 6:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-19T06:45:49.247Z: Starting 5 workers in us-central1-f...
    Aug 19, 2020 6:46:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-19T06:46:00.690Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 19, 2020 6:46:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-19T06:46:23.978Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 19, 2020 6:46:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-19T06:46:41.056Z: Workers have started successfully.
    Aug 19, 2020 6:46:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-19T06:46:41.090Z: Workers have started successfully.
    Aug 19, 2020 6:47:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-19T06:47:13.739Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 19, 2020 6:47:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-19T06:47:13.916Z: Cleaning up.
    Aug 19, 2020 6:47:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-19T06:47:13.998Z: Stopping worker pool...
    Aug 19, 2020 6:48:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-19T06:48:15.398Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 19, 2020 6:48:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-19T06:48:15.434Z: Worker pool stopped.
    Aug 19, 2020 6:48:24 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-18_23_45_40-4028865520954326159 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 4401049c-7b0c-4dfa-988b-097bda90483f and timestamp: 2020-08-19T06:48:24.240000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    12.468

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 19, 2020 6:48:24 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 7,5,main]) completed. Took 2 mins 57.791 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
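
For reference, a local re-run of the failing task with the flags suggested above might look like the following (invocation shown only as an illustration; the exact arguments used by the Jenkins job are not part of this excerpt):

    > ./gradlew :sdks:java:extensions:sql:perf-tests:integrationTest --stacktrace --warning-mode all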

BUILD FAILED in 4m 8s
106 actionable tasks: 72 executed, 34 from cache

Publishing build scan...
https://gradle.com/s/xzmocjsnzpa5s

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #888

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/888/display/redirect?page=changes>

Changes:

[Kyle Weaver] [BEAM-9956] Remove unbalanced code markup.

[zyichi] Bump up advanceProcessingTime duration in ParDoTest

[Luke Cwik] [BEAM-10040, BEAM-6804] Increase wait times to reduce flakiness.

[noreply] [BEAM-10752] add use_deprecated_read experiment for testSimpleInsert,


------------------------------------------
[...truncated 295.66 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 19, 2020 12:45:24 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 19, 2020 12:45:24 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 19, 2020 12:45:24 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 19, 2020 12:45:24 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 19, 2020 12:45:24 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 19, 2020 12:45:24 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 19, 2020 12:45:25 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
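
The exception above names its own remediation: a PCollection<Row> needs either an explicit coder or a row schema before the pipeline is finalized. A minimal stand-alone sketch of that remediation (not the integration test's actual code; the class name, schema fields, and sample values below are assumptions modelled on the SELECT list in the logged query, and a runner such as the direct runner is assumed to be on the classpath):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowCoderSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        // Hypothetical schema; field names/types mirror the logged SELECT list.
        Schema schema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt32Field("score")
                .build();

        Row row = Row.withSchema(schema).addValues("someone", "story", "Hello", 3).build();

        // Without a coder or schema, coder inference for Row elements fails with the
        // IllegalStateException shown above. Supplying a RowCoder avoids that; the
        // equivalent schema form suggested by the error is pcollection.setRowSchema(schema).
        PCollection<Row> rows = p.apply(Create.of(row).withCoder(RowCoder.of(schema)));

        p.run().waitUntilFinish();
      }
    }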

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 19, 2020 12:45:25 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 19, 2020 12:45:25 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 19, 2020 12:45:25 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 19, 2020 12:45:25 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 19, 2020 12:45:25 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 19, 2020 12:45:25 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 19, 2020 12:45:25 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 19, 2020 12:45:25 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
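
The two log entries above show what the planner pushed into the BigQuery storage read: only the used fields and the supported filter are requested. A rough stand-alone equivalent expressed directly against BigQueryIO (a sketch under assumptions, not the test's code; the table reference is a placeholder, while the field list and row restriction are copied from the plan above):

    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        // With the Storage API (DIRECT_READ), the projection and filter travel with the
        // read request instead of being applied after a full table scan.
        p.apply(
            BigQueryIO.readTableRows()
                .from("some-project:some_dataset.HACKER_NEWS")  // placeholder table reference
                .withMethod(Method.DIRECT_READ)
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

        // Actually executing this requires GCP credentials and a real table.
        p.run().waitUntilFinish();
      }
    }
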
    Aug 19, 2020 12:45:26 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 19, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 19, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-I6cUZ3eZeg69--xCDGfzHyfD3wSOW0i3JAFJawkeXc8.jar
    Aug 19, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-VfxtTPI7ZSDFJVYHmNLjyENM_RhKmBgQlmZ9_UojSd8.jar
    Aug 19, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-nlsackERPDkQhBDrOH3PF9WJW-MLiDHgmz8H8iLPlys.jar
    Aug 19, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-3e5yRU2xa6UiOxhocKuQkY1Fi8y4mCEyVFVcbiWRuYI.jar
    Aug 19, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-sC2afsWfaXjgKp82MJbZDKcwiOh_XoGCqWa9fjsS_IY.jar
    Aug 19, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-z8b9DwSlUN0a4ChDu0vyQNLyoinq_j0VvupXzUDAFss.jar
    Aug 19, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-AfjFKWTFv3yoz4lTIP3V9-dLqYuTJYNFLn8vJnxhzFg.jar
    Aug 19, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-AJsE98lE1cIz5YjlKAb7WhV2egJsrlYu-FgS9EIeWZo.jar
    Aug 19, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-pT5kVH-Aldp9qMN5opz9SEpFmFGNE9DNShgIN5ePwGY.jar
    Aug 19, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-O7znst7oESMsqc1i5Co6uxX37bcPrUVOIkrB1C2Yt74.jar
    Aug 19, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-2dbhp20Teh-ppAQFbObAVNNIGCBysxU8n715dkGkAg0.jar
    Aug 19, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-YnnimCLxweWOqM_Gn7rpsOnYQAL-e4oLmap8PlTij7Y.jar
    Aug 19, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-nlsackERPDkQhBDrOH3PF9WJW-MLiDHgmz8H8iLPlys.jar
    Aug 19, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-QEZYPR2emSqgi_w5Ux_Do1FoSKhD0KHTWeD0V3zgGV8.jar
    Aug 19, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-0TufTSM0pcYIjv6F4nsd9I-4EhQxr-3VoC37tJf4KQI.jar
    Aug 19, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-bbxQz0OO_cuQBhQJHaHd01_zcWqeJKhZYed0RSHiuxU.jar
    Aug 19, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-d7jvgidK4IpMwUEfB9D0HGsCLLyoriUB2GqOeG_A1A8.jar
    Aug 19, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4709895065533708470.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-hujRr8lFq4c9x2M2comuJaP8dk2xO0VanOgP4MmS0E8.jar
    Aug 19, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-GM3bxkRVK2IIXXPmhoHZnvmVvWlwRNBhYf1XdoXePkk.jar
    Aug 19, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-hdfW4OYAX9iJsUd1hmBnizEHGTGDTIe9PiphGiEfx6Y.jar
    Aug 19, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-sDNp2RhnJ_u6KkjHMNbF2PiCf459-0c1PDZGnyP-iTk.jar
    Aug 19, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-Vpt4gf5ZPXeK1bL6SS7z6vMU539VafdeHKwD4xjxG-0.jar
    Aug 19, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-agdf52gFDGa46lev1VRD2_gb2iYj3lsw0q0RCqqvhxc.jar
    Aug 19, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-t9JCeVd430tJoI4tkZ2Cbh0VeWzsxCAeA3XDn5N1iFg.jar
    Aug 19, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-p-vsiEguGy7GoMLH0MgdgKjK1y8m4c6vJdjpo3Oc3DM.jar
    Aug 19, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-c3wm0xoipXarV1KzW-10zDLflS-QyanwliizPlgYkKw.jar
    Aug 19, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-g-Nt86wZEGBEwUXCvQ__AYI0WVIcTZhbrYqMP6fpotM.jar
    Aug 19, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-oCtohfRoVVATeTFZtqaRwJBAXoWI-O9ep5ROA8yEDRw.jar
    Aug 19, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-t6YKQ7-T3wMjDLrMopwUzrqWglQeOFhGPMoC0OPk9kU.jar
    Aug 19, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-OrnFPDoR4iBEGnO6vr4eYxW5iL6ZSr35OL11Ys6vMJQ.jar
    Aug 19, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-2nliwCvAl0IxP14O08HW1jO3EDGyylksZQcdx9f-uag.jar
    Aug 19, 2020 12:45:29 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 0 seconds
    Aug 19, 2020 12:45:29 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 19, 2020 12:45:29 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 19, 2020 12:45:29 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 19, 2020 12:45:29 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 19, 2020 12:45:29 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 19, 2020 12:45:29 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93834 bytes, hash 877ba9ca706ad48a536d9905efa32fcff619b906f8babee356a6da76fb249e77> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-h3upynBq1IpTbZkF76Mvz_YZuQb4ur7jVqbadvsknnc.pb
    Aug 19, 2020 12:45:29 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Aug 19, 2020 12:45:30 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-18_17_45_29-14758735847635293447?project=apache-beam-testing
    Aug 19, 2020 12:45:30 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-18_17_45_29-14758735847635293447
    Aug 19, 2020 12:45:30 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-18_17_45_29-14758735847635293447
    Aug 19, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-19T00:45:29.881Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 19, 2020 12:45:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-19T00:45:37.500Z: Worker configuration: n1-standard-1 in us-central1-f.
    Aug 19, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-19T00:45:38.557Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 19, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-19T00:45:38.597Z: Expanding GroupByKey operations into optimizable parts.
    Aug 19, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-19T00:45:38.625Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 19, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-19T00:45:38.699Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 19, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-19T00:45:38.721Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 19, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-19T00:45:38.766Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 19, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-19T00:45:38.812Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 19, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-19T00:45:39.249Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 19, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-19T00:45:39.330Z: Starting 5 workers in us-central1-f...
    Aug 19, 2020 12:45:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-19T00:45:57.257Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 19, 2020 12:46:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-19T00:46:03.446Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 19, 2020 12:46:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-19T00:46:24.123Z: Workers have started successfully.
    Aug 19, 2020 12:46:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-19T00:46:24.148Z: Workers have started successfully.
    Aug 19, 2020 12:47:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-19T00:47:08.293Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 19, 2020 12:47:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-19T00:47:08.432Z: Cleaning up.
    Aug 19, 2020 12:47:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-19T00:47:08.513Z: Stopping worker pool...
    Aug 19, 2020 12:48:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-19T00:48:02.279Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 19, 2020 12:48:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-19T00:48:02.328Z: Worker pool stopped.
    Aug 19, 2020 12:48:09 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-18_17_45_29-14758735847635293447 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 4efef23f-d36a-4013-a296-ae790f45ef62 and timestamp: 2020-08-19T00:48:10.032000000Z:
                     Metric:                    Value:
                   read_time                    21.103
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 19, 2020 12:48:10 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 8,5,main]) completed. Took 2 mins 53.24 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 53s
106 actionable tasks: 72 executed, 34 from cache

Publishing build scan...
https://gradle.com/s/z4e324yopq4um

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #887

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/887/display/redirect?page=changes>

Changes:

[tobiasz.kedzierski] [BEAM-10623] Add workflow to run python tests on Linux/Windows/Mac

[tobiasz.kedzierski] [BEAM-10624] dtype explicit for the numpy arrays

[noreply] [BEAM-10715] Update Jet Runner validates runner testing (#12567)

[noreply] [BEAM-10557] Implemented SchemaIOProvider for DataStoreV1, Refactored

[Steve Niemitz] [BEAM-10523] Add support for custom DatumWriters to AvroIO.Write


------------------------------------------
[...truncated 325.33 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 18, 2020 10:27:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 18, 2020 10:27:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 18, 2020 10:27:20 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 18, 2020 10:27:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 18, 2020 10:27:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 18, 2020 10:27:20 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 18, 2020 10:27:20 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 18, 2020 10:27:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 18, 2020 10:27:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 18, 2020 10:27:20 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 18, 2020 10:27:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 18, 2020 10:27:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 18, 2020 10:27:20 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 18, 2020 10:27:20 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 18, 2020 10:27:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 18, 2020 10:27:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 18, 2020 10:27:24 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 18, 2020 10:27:24 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-pliYTNccuGYOA0Yw6I3tQw9JOFFLx6BvyY5KMqTqyFw.jar
    Aug 18, 2020 10:27:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-pliYTNccuGYOA0Yw6I3tQw9JOFFLx6BvyY5KMqTqyFw.jar
    Aug 18, 2020 10:27:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-eLiudKrSIxwFjKcJ9CfDa4snaabw-ymu0w0fFv4D2FU.jar
    Aug 18, 2020 10:27:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-EEhyI7KugxbDfMJhZWbOPtkjHB5JJVoGTis0aTxCujo.jar
    Aug 18, 2020 10:27:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-K8X_Xzr4RKXUCFghRgBbMy3sn_VaXkx6951t_5EsCyg.jar
    Aug 18, 2020 10:27:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-qbJ22AZEGsQ3MSQ2WSdmA2OBdEH7HvbNDKBDRiB7IOc.jar
    Aug 18, 2020 10:27:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-szgXYnMuQeJdkzLlr4NbED-rJOThS9TYq4fn1NPhy5c.jar
    Aug 18, 2020 10:27:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-MWcFyKZ2EK4I7Pb49pUFdLWmwKckuLVp8p4WE_kjRiA.jar
    Aug 18, 2020 10:27:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-tXvxKwLDCaknHjKCnC0TjLFQHPqNwBpkhFdPM343olY.jar
    Aug 18, 2020 10:27:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-nEkZGB_9JxXeV4noxdWncRyErZGW93scXnaLffMF4RM.jar
    Aug 18, 2020 10:27:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-M6O-usAG8UYz9dDsD9xTmsWsswQ70-O8GMWweevB9e4.jar
    Aug 18, 2020 10:27:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-9luIH8tEG0dz7_SV8wqp99CWRlAUt251Cz3IAMikNqg.jar
    Aug 18, 2020 10:27:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-aiih9p55uIe7M3kN--UPIfuj6hpZ3mD-PVFWpyWJbm0.jar
    Aug 18, 2020 10:27:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-7fyHDei_zDb5xTs9RB2786v2kA5DJz0fd90Jqxg8Ih4.jar
    Aug 18, 2020 10:27:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-yPrxqalbmR74gh9dTMZe4lqvpK1YGMpaXzt0K6wSC_I.jar
    Aug 18, 2020 10:27:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-xy_UauwpQg6lJIcRbnfVXjM6QhZUss7PKfeVVK332Lk.jar
    Aug 18, 2020 10:27:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-pOWrMR45Mjiq5DFW5HkxxlBGw1rFecXaTRHPJlPnUPw.jar
    Aug 18, 2020 10:27:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8093593770131708151.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-0JMyWpstwLc6O-RuXdrG_zP-2dZSMH3nGjrpDXJx4y0.jar
    Aug 18, 2020 10:27:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-jHWmSexG2QPuOiSMA6WB0SrzufPu_AK9Sjm8SQO7h8A.jar
    Aug 18, 2020 10:27:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-BHoG2p8QXWjiP9ILnuQ_6FEzFiAwou0JQSsEFko41N8.jar
    Aug 18, 2020 10:27:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-yy4K6qq2ft18r0Rcuq05ydQxc0raXEePYx3n9iK6XRg.jar
    Aug 18, 2020 10:27:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-trbg0cyYD-q14ATC1S_56Q9Wm25oERljagk4KOMxD70.jar
    Aug 18, 2020 10:27:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-p2BOWDAeW4aRPzuwJq1NaPS0Sm4HadhkzdFJVRD3Gg4.jar
    Aug 18, 2020 10:27:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-WwPDRsIaCeYb_1B0f-4D_KT2Mpx8MfVxIPvT-iecpTs.jar
    Aug 18, 2020 10:27:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-iGFsxvaUfr2z4sL3pElVxPka1VHtOCxkFYWp4Q5K7CE.jar
    Aug 18, 2020 10:27:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-b4ekLeu9ebb5n4-Zq5s4bj1nTc_GKMrxf3zhdsIiumo.jar
    Aug 18, 2020 10:27:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-EvHw4Ub1cNePDNj6LgCLUnuEL7cLjdS8GrPDhC6X_YY.jar
    Aug 18, 2020 10:27:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-cXWJ-3Xvu3g1ErEn0l8rDN-GKIrbM2NrbasJNIwuZsI.jar
    Aug 18, 2020 10:27:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-oZCAaAbp1hCXbS4pzpvXIU7FwoVDZ2BZuH1zt0RFv3s.jar
    Aug 18, 2020 10:27:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-US2FwM9SwdKVjI7f1ALx7mgKTXG9yRMhMCke3fwxJXA.jar
    Aug 18, 2020 10:27:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-VtevBJVs4SVQ--51lj3tCJ1NplUQpzXAmiEGwvQpyNY.jar
    Aug 18, 2020 10:27:25 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 1 seconds
    Aug 18, 2020 10:27:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 18, 2020 10:27:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 18, 2020 10:27:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 18, 2020 10:27:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 18, 2020 10:27:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 18, 2020 10:27:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93834 bytes, hash f10324d7b33ce652a68c54a0d76825ab1478d9aa9b51172ffd23613d18263880> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-8QMk17M85lKmjFSg12glqxR42aqbURcv_SNhPRgmOIA.pb
    Aug 18, 2020 10:27:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Aug 18, 2020 10:27:27 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-18_15_27_26-1280490115029737271?project=apache-beam-testing
    Aug 18, 2020 10:27:27 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-18_15_27_26-1280490115029737271
    Aug 18, 2020 10:27:27 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-18_15_27_26-1280490115029737271
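
    The same cancellation can also be requested from the submitting process itself, since
    DataflowRunner returns a job handle from Pipeline.run(). A minimal sketch, assuming the
    pipeline object is still in scope (options and transforms are omitted and would need to be
    configured as in the test above):

    import java.io.IOException;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.PipelineResult;

    class CancelSubmittedJob {
      // Sketch only: pipeline construction and DataflowPipelineOptions are assumed to have
      // been set up elsewhere; this shows just the handle-based cancellation path.
      static void cancel(Pipeline pipeline) throws IOException {
        PipelineResult result = pipeline.run();
        result.cancel(); // asks the service to cancel instead of waiting for DONE
      }
    }
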
    Aug 18, 2020 10:27:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-18T22:27:26.083Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 18, 2020 10:27:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-18T22:27:33.908Z: Worker configuration: n1-standard-1 in us-central1-a.
    Aug 18, 2020 10:27:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-18T22:27:34.656Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 18, 2020 10:27:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-18T22:27:34.776Z: Expanding GroupByKey operations into optimizable parts.
    Aug 18, 2020 10:27:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-18T22:27:34.797Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 18, 2020 10:27:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-18T22:27:34.850Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 18, 2020 10:27:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-18T22:27:34.881Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 18, 2020 10:27:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-18T22:27:34.903Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 18, 2020 10:27:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-18T22:27:34.931Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 18, 2020 10:27:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-18T22:27:35.201Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 18, 2020 10:27:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-18T22:27:35.490Z: Starting 5 workers in us-central1-a...
    Aug 18, 2020 10:27:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-18T22:27:40.891Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
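
    The warning above points at the Monitoring API for listing and deleting stale descriptors.
    A rough sketch of that cleanup with the google-cloud-monitoring v3 Java client follows;
    client method names can vary by release, and the filter string (matching only custom
    metric types) is an assumption for illustration:

    import com.google.api.MetricDescriptor;
    import com.google.cloud.monitoring.v3.MetricServiceClient;
    import com.google.monitoring.v3.ListMetricDescriptorsRequest;
    import com.google.monitoring.v3.ProjectName;

    class CleanUpMetricDescriptors {
      public static void main(String[] args) throws Exception {
        String project = "apache-beam-testing";
        try (MetricServiceClient client = MetricServiceClient.create()) {
          ListMetricDescriptorsRequest request =
              ListMetricDescriptorsRequest.newBuilder()
                  .setName(ProjectName.of(project).toString())
                  // Assumed filter: restrict the listing to user-defined (custom) metrics.
                  .setFilter("metric.type = starts_with(\"custom.googleapis.com/\")")
                  .build();
          for (MetricDescriptor descriptor : client.listMetricDescriptors(request).iterateAll()) {
            // getName() returns the full resource name accepted by the delete call.
            client.deleteMetricDescriptor(descriptor.getName());
          }
        }
      }
    }
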
    Aug 18, 2020 10:28:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-18T22:28:05.509Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 18, 2020 10:28:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-18T22:28:30.030Z: Workers have started successfully.
    Aug 18, 2020 10:28:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-18T22:28:30.058Z: Workers have started successfully.
    Aug 18, 2020 10:29:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-18T22:29:01.929Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 18, 2020 10:29:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-18T22:29:02.110Z: Cleaning up.
    Aug 18, 2020 10:29:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-18T22:29:02.190Z: Stopping worker pool...
    Aug 18, 2020 10:29:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-18T22:29:54.565Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 18, 2020 10:29:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-18T22:29:54.614Z: Worker pool stopped.
    Aug 18, 2020 10:30:02 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-18_15_27_26-1280490115029737271 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 978ed7f3-7dd9-46cb-918c-595b68008de7 and timestamp: 2020-08-18T22:30:02.986000000Z:
                     Metric:                    Value:
                   read_time                    10.454
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 18, 2020 10:30:04 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 12 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.186 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.167 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 2 mins 54.079 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 6m 54s
106 actionable tasks: 101 executed, 5 from cache

Publishing build scan...
https://gradle.com/s/w2admrgfckay4

Stopped 11 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #886

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/886/display/redirect?page=changes>

Changes:

[tobiasz.kedzierski] Improve CI documentation


------------------------------------------
[...truncated 293.19 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 18, 2020 12:45:26 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 18, 2020 12:45:26 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 18, 2020 12:45:26 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 18, 2020 12:45:26 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 18, 2020 12:45:26 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 18, 2020 12:45:26 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 18, 2020 12:45:26 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
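
    As the coder error above suggests, a PCollection of Beam Rows needs an explicit schema
    (PCollection.setRowSchema) or coder (setCoder) before the pipeline is finalized. A minimal
    sketch, with field names mirroring the query's projection and assumed field types:

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    class RowSchemaSketch {
      static PCollection<Row> attachSchema(PCollection<Row> rows) {
        Schema schema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();
        // Alternatively: rows.setCoder(RowCoder.of(schema));
        return rows.setRowSchema(schema);
      }
    }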

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 18, 2020 12:45:26 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 18, 2020 12:45:26 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 18, 2020 12:45:27 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 18, 2020 12:45:27 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 18, 2020 12:45:27 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 18, 2020 12:45:27 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 18, 2020 12:45:27 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 18, 2020 12:45:27 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
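
    The pushed-down projection and predicate above correspond to what BigQueryIO's Storage API
    read exposes directly. A sketch of an equivalent hand-written read; the table reference is a
    placeholder, since the test's actual table is not shown in this log:

    import java.util.Arrays;
    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;

    class DirectReadSketch {
      static TypedRead<TableRow> read() {
        return BigQueryIO.readTableRows()
            .from("some-project:some_dataset.HACKER_NEWS") // placeholder table id
            .withMethod(Method.DIRECT_READ)
            .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
            .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2");
      }
    }
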
    Aug 18, 2020 12:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 18, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 18, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-qICxZ1GW4kjRlu_HsJ30GEO_rXx2XjwQKuR_aoRpnpE.jar
    Aug 18, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-YwoT-Rbkhj5U51PleXv4ZrKjoXvqV0sAGx2zABFIeRw.jar
    Aug 18, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-13hXXpcXvMz829fjgRfNAKiXlvBqNptUrWtWV1JOSsM.jar
    Aug 18, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test6152621540439281639.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-k3BS8lCvIn-eA9iqvCcjqRSlAK69C_ng3MBkgYrNiL8.jar
    Aug 18, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-n-UDBFJibq_e1NDKCg07vPP7G7K5GFQaGpukdQvG_8g.jar
    Aug 18, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-SzK7tJX44tDLFleH2Tg1anY9eqMpp4Gj6_Xhq_Bpbw0.jar
    Aug 18, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-0wYPaalpxNjwb28qXndG5PZ1o1zivceZ7Pr8YrkxQgs.jar
    Aug 18, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-WbIAnydyxSGlhSFb3gP9I_vvalSXgGvxMNog8FHYF1s.jar
    Aug 18, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-LQiAL7T8P-g0Z98rpyWPXMI4h2Yqfpw5V8Xl3y3l2zQ.jar
    Aug 18, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-QnZc92KxH1YoGNS_g3MsrPLrL3vfjyZvcvdOSNCzsyI.jar
    Aug 18, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-7onTKJiUJp6W67AR0ZdVFsUDvn36sAIEceyzO5E-dU8.jar
    Aug 18, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-00ArVUlvanolhrFZBOpoyw_91_rRJuM7LK2tIv7MUcU.jar
    Aug 18, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-IHsrMnu8p5WOkZwiC8EGrEBFAetsF0S_5K-5jBTzukM.jar
    Aug 18, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-e6h5P6C3VUD3k4Hfli3mrYqLPDJC-m6HF3edolLhJb0.jar
    Aug 18, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-eVJegg-gMqcOb1NNO0D2Phr9NUq198NH4UdXZU4jdZs.jar
    Aug 18, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-kFL6JiJm-UDCpC7Pc6k4fIKWOrszupNojsBjawBnZ1I.jar
    Aug 18, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-qICxZ1GW4kjRlu_HsJ30GEO_rXx2XjwQKuR_aoRpnpE.jar
    Aug 18, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-KKVzQ3AhtsqNu3avyMn98xcXDuGrZdcI3gh8Pf8eMvU.jar
    Aug 18, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-Lg9PII34q3Vyllnaub6JXcPD4naZCY_Da_ZnBY9p91w.jar
    Aug 18, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-FfcT0jIzyYCk4nY9UWo5ePOPpd1kGoui6tJUWOGUaQw.jar
    Aug 18, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-mmKJRSYmFdaGR8E8DCU-SBQ6VFkRFNraiwt3xFAsbFQ.jar
    Aug 18, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-cDEItuPUDUxmdm8XFZze1o5xT0C3InllEanor-VYsDg.jar
    Aug 18, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-9af9Y63ubxYP5Pm6TyhWsMf6vN9xoC_vBOXJNgvQUdI.jar
    Aug 18, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-L5gX8HlZVNLslex6n3pxmMMByNmN6Y7udlZHER1a5eA.jar
    Aug 18, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-uyk5DWdSf-5S5jLbzSfFmyVpsAnbAXkyaco3Xkdtmws.jar
    Aug 18, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-looF_JG5xIshpDc1GzzODhTQ-Dt0ML9J4tOziTOEaP8.jar
    Aug 18, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-qxhvZBZYY04OIycsZJigWFgacaDjOLJX2eL-Vjdj6R4.jar
    Aug 18, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-58BIFfJe8Z4h5BLU630mBxeNeH_xDMPiv1wmSFpqpVo.jar
    Aug 18, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-IlBVldFdr8F17MDarsF57zqovV1694ULi6mH__C1wE4.jar
    Aug 18, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-q70QOMlVUlUtbO2ChyTjPsMubdOE8t00rRhMTDwCTLc.jar
    Aug 18, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-DAX20K5Af1OOhm6IB0rpGgMJ5_lawm3FQH-sd9c7NHc.jar
    Aug 18, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 1 seconds
    Aug 18, 2020 12:45:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 18, 2020 12:45:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 18, 2020 12:45:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 18, 2020 12:45:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 18, 2020 12:45:33 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 18, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93834 bytes, hash d816d4580cff896829ec4698a70979a104ea6575a94fdd1bf96c645ae1ca045e> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-2BbUWAz_iWgp7EaYpwl5oQTqZXWpT90b-WxkWuHKBF4.pb
    Aug 18, 2020 12:45:33 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Aug 18, 2020 12:45:34 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-18_05_45_33-16422588806574250870?project=apache-beam-testing
    Aug 18, 2020 12:45:34 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-18_05_45_33-16422588806574250870
    Aug 18, 2020 12:45:34 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-18_05_45_33-16422588806574250870
    Aug 18, 2020 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-18T12:45:33.476Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 18, 2020 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-18T12:45:40.889Z: Worker configuration: n1-standard-1 in us-central1-f.
    Aug 18, 2020 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-18T12:45:41.655Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 18, 2020 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-18T12:45:41.703Z: Expanding GroupByKey operations into optimizable parts.
    Aug 18, 2020 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-18T12:45:41.736Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 18, 2020 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-18T12:45:41.832Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 18, 2020 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-18T12:45:41.862Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 18, 2020 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-18T12:45:41.899Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 18, 2020 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-18T12:45:41.932Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 18, 2020 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-18T12:45:42.417Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 18, 2020 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-18T12:45:42.487Z: Starting 5 workers in us-central1-f...
    Aug 18, 2020 12:46:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-18T12:46:06.159Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 18, 2020 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-18T12:46:10.180Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 18, 2020 12:46:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-18T12:46:35.692Z: Workers have started successfully.
    Aug 18, 2020 12:46:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-18T12:46:35.722Z: Workers have started successfully.
    Aug 18, 2020 12:47:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-18T12:47:08.621Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 18, 2020 12:47:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-18T12:47:08.785Z: Cleaning up.
    Aug 18, 2020 12:47:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-18T12:47:08.873Z: Stopping worker pool...
    Aug 18, 2020 12:47:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-18T12:47:57.286Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 18, 2020 12:47:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-18T12:47:57.344Z: Worker pool stopped.
    Aug 18, 2020 12:48:05 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-18_05_45_33-16422588806574250870 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 35b9b23a-4a66-41f2-af1f-b032227428e3 and timestamp: 2020-08-18T12:48:05.076000000Z:
                     Metric:                    Value:
                   read_time                    14.507
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 18, 2020 12:48:05 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 8,5,main]) completed. Took 2 mins 47.303 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 42s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/4vbcalvgezjfu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #885

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/885/display/redirect?page=changes>

Changes:

[heejong] [BEAM-10679] improving XLang KafkaIO streaming test

[heejong] fix formatting

[heejong] fix incorrect coder issue

[heejong] use Create


------------------------------------------
[...truncated 294.90 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 18, 2020 6:45:34 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 18, 2020 6:45:34 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 18, 2020 6:45:34 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 18, 2020 6:45:34 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 18, 2020 6:45:34 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 18, 2020 6:45:34 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 18, 2020 6:45:34 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 18, 2020 6:45:34 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 18, 2020 6:45:34 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 18, 2020 6:45:34 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 18, 2020 6:45:34 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 18, 2020 6:45:34 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 18, 2020 6:45:34 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 18, 2020 6:45:34 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 18, 2020 6:45:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 18, 2020 6:45:35 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 18, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 18, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-cHjF-jFmMdgHN833xCay44_6vS_EpTUf1EEV06ooHpU.jar
    Aug 18, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-ZV6uDlvk5Z_rsqBguUzoJPEAQfuDkWMEgsNe21EZjJg.jar
    Aug 18, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-vNGOALKmhOGJ56ZzQSFIlbHa_dKKvkEJANzBr3rItS4.jar
    Aug 18, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-GZ3_unAwoQKD31OXVHqYbLhyIzygNKdNY90qXDxuXWc.jar
    Aug 18, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-sTijaxLtTyEVVoeQ0jeXUDFGqlhZIyklqn2QhUpMNsc.jar
    Aug 18, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-cHjF-jFmMdgHN833xCay44_6vS_EpTUf1EEV06ooHpU.jar
    Aug 18, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-6ssm4i63bhh06T0UucFatwZEukBS9m6necScorX5Y7c.jar
    Aug 18, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-4S6pGSuPBgZ3QrtvjBOgMvP8xuOhiml9suLvXIbRUG0.jar
    Aug 18, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-2IoeFpNWxR7xB8N7CpTtbyrxS3SwSCA-sdRwLuSD2Bc.jar
    Aug 18, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-pHMH3mIx0xnDH5415FbY3BRJY4o4rPtMlp1jPpzJwW4.jar
    Aug 18, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-mUKbteNyERPQFwBVJzz1ZNXrGTNPUYNnNHLD7Bdt8Ik.jar
    Aug 18, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-Q_-X4c00T8REoTSN6jyOIxnWoTDN7XwnlOwGUmVRJs8.jar
    Aug 18, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-Od_d6I9NrRMcVk1KonjEni1M1SLpLKh4PJKvtaM5q0E.jar
    Aug 18, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-U00CG_q_9CKLRHzHfXF0iEY9Gn2SSCGeKNnujUAr0q4.jar
    Aug 18, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-riatlvRvzZ_47SBtLTPXlCQg6TM-85PClcjrEkSSbMY.jar
    Aug 18, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-cm9IyI0CeZ_DHKOwmEucsEcWBIUHUNWy4O63R5dLgcI.jar
    Aug 18, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-B4crMxRILpRARYzrdvX32jdSnhQdm6WJUtMZHghoOTM.jar
    Aug 18, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-u4hUJpM7qju-B5dy9XsjcnjgMempH2fI5bRwAyX40bk.jar
    Aug 18, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-SdaR3cMNik1cuZJUQOP7oTRonP0hhBFyjrz0u3hoLkY.jar
    Aug 18, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-87v_OjtrIU1INnxD8A4fi9KyKyyVJwc6wL875GbvHK0.jar
    Aug 18, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-Gu8qg75MoICOFkgp7cJh4j6j44zHIgRqOdBUAOK2_qU.jar
    Aug 18, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8668476735243284445.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-hlxWC_YdQvo9P8CS-J6jp4vBOs2ICIzeENozzZt_DI4.jar
    Aug 18, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-pdqdcl0GtCK_xTd3-JDWotVblXDGAczVLcx9XbtGDHY.jar
    Aug 18, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-9O7VakZBnQ-E1_Jd_S-coer8WZdQjy_KlyRvtEW4LeA.jar
    Aug 18, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-oHjx-8hUNjF_aj1uCmSqvO7ptRBFNUC2KCRw0QRUFqg.jar
    Aug 18, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-DVbKCBZPjuSqN2XII9q2nKSmhLOMLk5n6FZIYCBU5qE.jar
    Aug 18, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-eGZzTIvlhAppvzgOYZ0BJZ1UME674tWa1jJO8Q6xgZU.jar
    Aug 18, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-Yo8K5caKo6jcB3tdYtDc_9Hh9cLCCNn6216cmAJ8EME.jar
    Aug 18, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-o23xbcbB0tv8n5lnCyNobmY6QkTwWrXYbPg29YzwNLE.jar
    Aug 18, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-bY2Tz6SM3d4GBp5dJKg3XqqJQQy4YeUYvsPNRVl2LmE.jar
    Aug 18, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-ubBbzgKQYtscUe60nNt59qFdhOUwLaXNLnE2qIlo1Wk.jar
    Aug 18, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 1 seconds
    Aug 18, 2020 6:45:39 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 18, 2020 6:45:39 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 18, 2020 6:45:39 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 18, 2020 6:45:39 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 18, 2020 6:45:39 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 18, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93839 bytes, hash def4410ea3fad07bdb45a09e74919f572611513ed61401fd31bc046182132d0b> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-3vRBDqP60HvbRaCedJGfVyYRUT7WFAH9MbwEYYITLQs.pb
    Aug 18, 2020 6:45:39 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Aug 18, 2020 6:45:40 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-17_23_45_39-15142839661282990918?project=apache-beam-testing
    Aug 18, 2020 6:45:40 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-17_23_45_39-15142839661282990918
    Aug 18, 2020 6:45:40 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-17_23_45_39-15142839661282990918
    Aug 18, 2020 6:45:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-18T06:45:39.620Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 18, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-18T06:45:47.517Z: Worker configuration: n1-standard-1 in us-central1-f.
    Aug 18, 2020 6:45:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-18T06:45:48.253Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 18, 2020 6:45:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-18T06:45:48.372Z: Expanding GroupByKey operations into optimizable parts.
    Aug 18, 2020 6:45:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-18T06:45:48.405Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 18, 2020 6:45:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-18T06:45:48.481Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 18, 2020 6:45:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-18T06:45:48.510Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 18, 2020 6:45:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-18T06:45:48.533Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 18, 2020 6:45:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-18T06:45:48.567Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 18, 2020 6:45:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-18T06:45:49.054Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 18, 2020 6:45:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-18T06:45:49.137Z: Starting 5 workers in us-central1-f...
    Aug 18, 2020 6:46:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-18T06:46:18.671Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 18, 2020 6:46:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-18T06:46:20.512Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 18, 2020 6:46:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-18T06:46:40.720Z: Workers have started successfully.
    Aug 18, 2020 6:46:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-18T06:46:40.758Z: Workers have started successfully.
    Aug 18, 2020 6:47:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-18T06:47:15.701Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 18, 2020 6:47:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-18T06:47:15.902Z: Cleaning up.
    Aug 18, 2020 6:47:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-18T06:47:15.981Z: Stopping worker pool...
    Aug 18, 2020 6:48:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-18T06:48:05.827Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 18, 2020 6:48:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-18T06:48:05.879Z: Worker pool stopped.
    Aug 18, 2020 6:48:13 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-17_23_45_39-15142839661282990918 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 9af1d520-764e-4c2e-9cb1-762772bb5597 and timestamp: 2020-08-18T06:48:13.969000000Z:
                     Metric:                    Value:
                   read_time                    14.012
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 18, 2020 6:48:14 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.019 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 2 mins 48.461 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 52s
106 actionable tasks: 72 executed, 34 from cache

Publishing build scan...
https://gradle.com/s/neyweumtnggps

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #884

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/884/display/redirect?page=changes>

Changes:

[yoshiki.obata] [BEAM-9980] add groovy functions for python versions

[yoshiki.obata] [BEAM-9980] update dataflow test-suites to switch python versions using

[yoshiki.obata] [BEAM-9980] version switchable dataflow tasks to be invoked

[yoshiki.obata] [BEAM-9980] :sdks:python:test-suites:dataflow included in

[Udi Meiri] [BEAM-10697] Remove testPy2Cython from precommit

[noreply] [BEAM-9891] Generate query execution summary table after finishing jobs

[noreply] [BEAM-9919] Added an External Transform API to Go SDK (#12445)


------------------------------------------
[...truncated 294.21 KB...]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 18, 2020 12:45:37 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 18, 2020 12:45:37 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 18, 2020 12:45:37 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 18, 2020 12:45:37 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 18, 2020 12:45:37 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 18, 2020 12:45:37 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 18, 2020 12:45:37 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
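
The root cause above is a missing Row schema: a PCollection<Row> produced by a ParDo has no coder Beam can infer until a schema is attached. Below is a minimal sketch of the remedy the message itself names, PCollection.setRowSchema; the pipeline, field names, and the inline mapping function are illustrative assumptions and are not code from BigQueryIOPushDownIT.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.MapElements;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;
    import org.apache.beam.sdk.values.TypeDescriptor;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Illustrative schema; the real HACKER_NEWS table has many more fields.
        Schema schema =
            Schema.builder().addStringField("type").addInt64Field("score").build();

        PCollection<Row> rows =
            p.apply(Create.of("story:10", "job:3"))
                .apply(
                    MapElements.into(TypeDescriptor.of(Row.class))
                        .via(
                            (String s) ->
                                Row.withSchema(schema)
                                    .addValues(s.split(":")[0], Long.parseLong(s.split(":")[1]))
                                    .build()))
                // Without this call (or an explicit setCoder), Beam cannot infer a
                // coder for the Row output and fails with the IllegalStateException
                // shown in the stack trace above.
                .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }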

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 18, 2020 12:45:37 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 18, 2020 12:45:37 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 18, 2020 12:45:38 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 18, 2020 12:45:38 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 18, 2020 12:45:38 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 18, 2020 12:45:38 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 18, 2020 12:45:38 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 18, 2020 12:45:38 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
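
The plan above shows the projection (usedFields) and the WHERE clause being pushed into the BigQuery source rather than evaluated in a downstream Calc. As a rough, illustrative sketch only, the same query shape can be written against an in-memory table with Beam SQL's SqlTransform; the test itself builds its pipeline through BeamSqlRelUtils and a BigQuery table provider (as the stack traces show), and the sample rows below are invented.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class HackerNewsQuerySketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        Schema schema =
            Schema.builder()
                .addStringField("by")
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();

        PCollection<Row> hackerNews =
            p.apply(
                Create.of(
                        Row.withSchema(schema).addValues("alice", "story", "A story", 10L).build(),
                        Row.withSchema(schema).addValues("bob", "comment", "A comment", 1L).build())
                    .withRowSchema(schema));

        // Same SELECT/WHERE shape as the test query; PCOLLECTION refers to the
        // single input PCollection handed to SqlTransform.
        PCollection<Row> result =
            hackerNews.apply(
                SqlTransform.query(
                    "SELECT `by` AS `author`, `type`, `title`, `score` "
                        + "FROM PCOLLECTION "
                        + "WHERE (`type` = 'story' OR `type` = 'job') AND `score` > 2"));

        p.run().waitUntilFinish();
      }
    }
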
    Aug 18, 2020 12:45:39 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 18, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-gpzkweKhWljbIn7DrIZEZiTkhL5d-DwmXB72mqDmwsY.jar
    Aug 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-DF_oZs2ru8wPm2sW4esi5fH-_F64GdQaEd_cta0HuWg.jar
    Aug 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-m-76jvBDrjFgB_1jmxJHqnQMVKU38QdaJRGnP2D1BgI.jar
    Aug 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-r9mLOi7Ep6w3rCEQw_3cPu2CsMnTQ-BuvZu8J-ShpOI.jar
    Aug 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-muT-UIx4zXGw4K_plV4UywwxbUDzcplRduMjxs0d0RI.jar
    Aug 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-oO7zhr86UxRkyuXjuQHTg-KxIdhNxNrwoFPx0xUtyjA.jar
    Aug 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-eGMZ021vwt1E9BGquXIPJSDW-yMxJ4nAlSc4x3RhpJ4.jar
    Aug 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-cfOzw7JYxhtspVC9P63VLUb1I-TvjuNR5ko2sLvWNGw.jar
    Aug 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-5_DsLSrcTNuxz-ujwhq_buBh5cMyYrQPInTL7B6uG84.jar
    Aug 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-OuevGbNOXLBxyYN5UzaOwWy1vYfkqBjb_izJ2ORPR9o.jar
    Aug 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-WOUXbIl-FEydESaZWnFfNbbA0VgLV2lMp-26QbWEpKg.jar
    Aug 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-c2XuE_ib09l0FF-4t3tSn_qz_RuthkvITrT9clit5hM.jar
    Aug 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test6152198319606854943.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-xjcCE9Q_pqPtfKqL97wXZ7OGrpysDJ3xQFy_EbWKaVE.jar
    Aug 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-W7pk9t30pGTypQ_Nq69R79FhDA5tyqYxq840gKJzPsU.jar
    Aug 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-hZuBPphurToV0S3nTIiwqUiR4t1ZAYF_K6rm6jb0FgY.jar
    Aug 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-EpxPAmPlFsKHeqRggByeJjX3Lsy3audBSIxdXXRWDfE.jar
    Aug 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-af_FEIwRyXFN9Ip3sVRiCPSmD6pLQ4amU7VbbjHK1vA.jar
    Aug 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-IBtkx56BUR0gEck2Jdwia6IaIUq07orM6HDPaLREuqs.jar
    Aug 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-DmFXf3yWzx0CgmsU6Y9MDT-hcenlVX6chEEd4XXJq8Q.jar
    Aug 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-k-Qy-fVxaXqBpUzYt8Y8wpqJUwvZdz7ydvIex8zdRx0.jar
    Aug 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-gpzkweKhWljbIn7DrIZEZiTkhL5d-DwmXB72mqDmwsY.jar
    Aug 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-Va7seLgTPQUMpSsdQjuIMuiYNnnE0L9KEQR3BOSn8Wc.jar
    Aug 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-WRnvBRIF8HYHwrLgn6dDz9_tSn3TYLVJECPTTiDwsvY.jar
    Aug 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-RpP6o3xw5Vde_j2spYRC2cBSJZP19_nFgc9GdyyvY74.jar
    Aug 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-ClnvwOAjsmvGSvuEAlIZIbDzCGNArCw9hL6Brv90J4Y.jar
    Aug 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-B5D9GE2YUolQAFlRXKCHxiAQ-yBQFlBntl8bXXMrB8Y.jar
    Aug 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-oN69LdUL7Qs9JG6BJnRkgtBaVTC35lhNKtKoN3cO1SM.jar
    Aug 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-bIAAoQRn2lLavrbycat9NAs44KBTYKC0eHECW9H3s84.jar
    Aug 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-Zmzv8bIdddmgJkEB1oNkZq5eI6cHzRfnQ5p3uE4PLJM.jar
    Aug 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-n4ztygKrl6W5I5v7gbrjdhs6pD1_OPlM3RPUn0ak8Pw.jar
    Aug 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-53Esh4i90SsLC4dq1am_a_Tq0z26OqQVXu3ImsTxdwk.jar
    Aug 18, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 1 seconds
    Aug 18, 2020 12:45:42 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 18, 2020 12:45:42 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 18, 2020 12:45:42 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 18, 2020 12:45:42 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 18, 2020 12:45:42 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 18, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93834 bytes, hash fe945169a8b54be45e9745135c4c36dd4e5e3bce10b1f07975ed6f7f6629666e> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-_pRRaai1S-Rel0UTXEw23U5eO84QsfB5de1vf2YpZm4.pb
    Aug 18, 2020 12:45:42 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Aug 18, 2020 12:45:43 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-17_17_45_42-12577428648021372635?project=apache-beam-testing
    Aug 18, 2020 12:45:43 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-17_17_45_42-12577428648021372635
    Aug 18, 2020 12:45:43 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-17_17_45_42-12577428648021372635
    Aug 18, 2020 12:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-18T00:45:42.806Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 18, 2020 12:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-18T00:45:52.080Z: Worker configuration: n1-standard-1 in us-central1-a.
    Aug 18, 2020 12:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-18T00:45:53.005Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 18, 2020 12:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-18T00:45:53.076Z: Expanding GroupByKey operations into optimizable parts.
    Aug 18, 2020 12:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-18T00:45:53.101Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 18, 2020 12:45:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-18T00:45:53.219Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 18, 2020 12:45:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-18T00:45:53.259Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 18, 2020 12:45:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-18T00:45:53.291Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 18, 2020 12:45:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-18T00:45:53.314Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 18, 2020 12:45:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-18T00:45:53.938Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 18, 2020 12:45:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-18T00:45:54.017Z: Starting 5 workers in us-central1-a...
    Aug 18, 2020 12:46:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-18T00:46:19.980Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
    Aug 18, 2020 12:46:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-18T00:46:20.020Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
    Aug 18, 2020 12:46:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-18T00:46:27.160Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 18, 2020 12:46:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-18T00:46:30.646Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 18, 2020 12:46:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-18T00:46:39.045Z: Workers have started successfully.
    Aug 18, 2020 12:46:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-18T00:46:39.091Z: Workers have started successfully.
    Aug 18, 2020 12:47:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-18T00:47:09.677Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 18, 2020 12:47:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-18T00:47:09.901Z: Cleaning up.
    Aug 18, 2020 12:47:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-18T00:47:10.029Z: Stopping worker pool...
    Aug 18, 2020 12:48:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-18T00:48:03.655Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 18, 2020 12:48:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-18T00:48:03.751Z: Worker pool stopped.
    Aug 18, 2020 12:48:13 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-17_17_45_42-12577428648021372635 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 483b5538-c3a9-4933-8e90-cf4ad3248349 and timestamp: 2020-08-18T00:48:13.458000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    11.005

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 18, 2020 12:48:13 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.016 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 5,5,main]) completed. Took 2 mins 45.199 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 46s
106 actionable tasks: 73 executed, 33 from cache

Publishing build scan...
https://gradle.com/s/y4zd4vzokqlei

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #883

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/883/display/redirect?page=changes>

Changes:

[katarzyna.kucharczyk] [BEAM-10672] Added streaming option to combine test

[katarzyna.kucharczyk] [BEAM-10672] Added experimental dataflow param

[katarzyna.kucharczyk] [BEAM-10672] Spottless Apply

[katarzyna.kucharczyk] [BEAM-10672] Updated streaming commands

[katarzyna.kucharczyk] [BEAM-10672] Fixes after review in combine python load test

[katarzyna.kucharczyk] [BEAM-10672] Changed jobType variable into mode to declare if it is

[Luke Cwik] [BEAM-10670] Make key coder deterministic by using upstream PCollection


------------------------------------------
[...truncated 294.34 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 17, 2020 6:45:29 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 17, 2020 6:45:29 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 17, 2020 6:45:29 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 17, 2020 6:45:29 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 17, 2020 6:45:29 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 17, 2020 6:45:29 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 17, 2020 6:45:29 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 17, 2020 6:45:29 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 17, 2020 6:45:29 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 17, 2020 6:45:29 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 17, 2020 6:45:29 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 17, 2020 6:45:29 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 17, 2020 6:45:29 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 17, 2020 6:45:29 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 17, 2020 6:45:29 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 17, 2020 6:45:30 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 17, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 17, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-6ejSw4Z2oHYzRx9IIo6L-knvHRkqMMRn3K_4s7jO8bU.jar
    Aug 17, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-ZpNfXFD53FXE27SyYLvM7pnQ8xOjJuHMyotBaHHAFBk.jar
    Aug 17, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-jsHWHJWcn8L6AVij_MDAoGyOSS8iz4lMupM_Wb7bJyI.jar
    Aug 17, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-XKRBXeAYTA-FO4Bg_eo9z6FO4LTBIf4RB4XxjdsGB1U.jar
    Aug 17, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-Pf2oQ17P1UjlAWegcb61vd6XYT20Et-BIHkOxWmBMX4.jar
    Aug 17, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-fFyFNCZN7VTu1Ivy8YE9ldYFCXDUPzRqgaQV8mnwnVk.jar
    Aug 17, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-5mg4it468af06YapZYL9NnXD5lZXToJTBGXueUCwcKY.jar
    Aug 17, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-AkGCOhD1uAekPs_nBEfPfHMYHmq4SHhP9sLpisErKqw.jar
    Aug 17, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-HIumbKUzkWga8_iyGwsWfhNn7_RWHj4ieffnNqCarqs.jar
    Aug 17, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-fz4utcakUoF1FWy-BJ0Ib--ZvwHWePnO95PFmawUGO8.jar
    Aug 17, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3541678566503360544.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-ruSj22O8FmqQQ7ll3vfTwRMi4MkAzXqIkyFcA4R53pk.jar
    Aug 17, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-0q7LNsToOywBN_3MX5WV4wwzA4WOyG2Rz2M5sjEXwlk.jar
    Aug 17, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-FeE8NRP7QgE_9gXWmS2XBFMoZB1rCg2BCyP7d-eRTW0.jar
    Aug 17, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-Y061XVunuquoNoEaJGI2IxJtPzVSEoxVeyZDvEOvY5c.jar
    Aug 17, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-bQsSs1hQxBkuyN-HkArRQrg7JQv-Vrc5Me_tDybiJ6s.jar
    Aug 17, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-rRhjHRvae88vKObhING4qdECaPjZVH7YVj6pdhff7fU.jar
    Aug 17, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-4maMplYkzrwnPrdDLvUioOPYYAWyi5G3idTe3SCBQGs.jar
    Aug 17, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-bighP1Jfv19Nj5FBDE0BpSPgQYDOl7kYUnEQaml5dDU.jar
    Aug 17, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-u-fo6gzNPJnBYL3cVDQko9l9zHZKhrAN8PHDdbhb4bU.jar
    Aug 17, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests--ULufOpXAej80uq7ODcTG1KDp_n1m4Da3UVkLdoE_Fk.jar
    Aug 17, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-krzwRiv0zJt3PAF7TqYMgxHWAJ5ngocGnYLnXcPzLa4.jar
    Aug 17, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-6ejSw4Z2oHYzRx9IIo6L-knvHRkqMMRn3K_4s7jO8bU.jar
    Aug 17, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT--nBerWSAgOz2Hmp7MHcK4xDAG7UEBWftMoxZEYwqQDE.jar
    Aug 17, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-Q8L1HJsT5KRVKliPBsa7IflOrT-rugzmmqEaA7y_aKs.jar
    Aug 17, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-eUK26PJNpxmyB0HiDG7e6DQghb_yoOuaPFW-903LC7Y.jar
    Aug 17, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-phvCJaA8AkD1qNoO1-63nQRAfWsfs8foWvRFhoAGm0U.jar
    Aug 17, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-jYqsDQaMrjsNn1ozQe4Dj2rYDQqE3HDKHxhNE0L4svY.jar
    Aug 17, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-Jpz6abA0pLS7G6STe53ltQIrj1euJ3fboKZpy5IFJvk.jar
    Aug 17, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-3Vby5aArmWB3T2QNxZ9fPM3PgrurTJiit8iaNsE0YOE.jar
    Aug 17, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-3pZWpPgTzzSzGGpU0us1MxyrFMve9k5vkNdH2Bn5G1w.jar
    Aug 17, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-m47NkvH3ZUZtth6EO8wvBFyi31IvZaEqhgfjjh_0JYU.jar
    Aug 17, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 1 seconds
    Aug 17, 2020 6:45:34 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 17, 2020 6:45:34 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 17, 2020 6:45:34 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 17, 2020 6:45:34 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 17, 2020 6:45:34 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 17, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93834 bytes, hash d4e6ef68321a140dcf2c6d1e22db61c7c253322c33fdbc4fc005260c6ccb2d27> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-1ObvaDIaFA3PLG0eItthx8JTMiwz_bxPwAUmDGzLLSc.pb
    Aug 17, 2020 6:45:34 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Aug 17, 2020 6:45:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-17_11_45_34-4556858212031065900?project=apache-beam-testing
    Aug 17, 2020 6:45:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-17_11_45_34-4556858212031065900
    Aug 17, 2020 6:45:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-17_11_45_34-4556858212031065900
    Aug 17, 2020 6:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-17T18:45:34.806Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 17, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-17T18:45:43.284Z: Worker configuration: n1-standard-1 in us-central1-a.
    Aug 17, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-17T18:45:44.096Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 17, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-17T18:45:44.129Z: Expanding GroupByKey operations into optimizable parts.
    Aug 17, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-17T18:45:44.195Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 17, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-17T18:45:44.257Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 17, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-17T18:45:44.307Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 17, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-17T18:45:44.343Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 17, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-17T18:45:44.378Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 17, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-17T18:45:44.735Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 17, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-17T18:45:44.818Z: Starting 5 workers in us-central1-a...
    Aug 17, 2020 6:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-17T18:46:02.782Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 17, 2020 6:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-17T18:46:07.670Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 17, 2020 6:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-17T18:46:29.055Z: Workers have started successfully.
    Aug 17, 2020 6:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-17T18:46:29.081Z: Workers have started successfully.
    Aug 17, 2020 6:47:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-17T18:47:01.206Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 17, 2020 6:47:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-17T18:47:01.355Z: Cleaning up.
    Aug 17, 2020 6:47:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-17T18:47:01.431Z: Stopping worker pool...
    Aug 17, 2020 6:48:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-17T18:48:08.655Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 17, 2020 6:48:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-17T18:48:08.694Z: Worker pool stopped.
    Aug 17, 2020 6:48:15 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-17_11_45_34-4556858212031065900 finished with status DONE.
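
For comparison with the plan above (BeamPushDownIOSourceRel with usedFields and a pushed-down BigQueryFilter), the same query shape can be issued against any schema-aware PCollection through SqlTransform. This is only a rough sketch under assumptions: an in-memory input registered under the name HACKER_NEWS, which does not go through the BigQuery table provider, so no filter push-down happens on this path.

    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.PCollectionTuple;
    import org.apache.beam.sdk.values.Row;
    import org.apache.beam.sdk.values.TupleTag;

    final class PushDownQueryShapeSketch {
      // The input PCollection<Row> must already carry a schema with the
      // fields referenced below (by, type, title, score).
      static PCollection<Row> query(PCollection<Row> hackerNews) {
        return PCollectionTuple.of(new TupleTag<Row>("HACKER_NEWS"), hackerNews)
            .apply(
                SqlTransform.query(
                    "SELECT `by` AS author, `type`, `title`, `score` "
                        + "FROM HACKER_NEWS "
                        + "WHERE (`type` = 'story' OR `type` = 'job') AND `score` > 2"));
      }

      private PushDownQueryShapeSketch() {}
    }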

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): b4bd163a-c782-42b5-889f-e09bd4d47f38 and timestamp: 2020-08-17T18:48:15.821000000Z:
                     Metric:                    Value:
                   read_time                    12.219
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 17, 2020 6:48:16 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.017 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker,5,main]) completed. Took 2 mins 55.252 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 50s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/gft6xma7trgx4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #882

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/882/display/redirect>

Changes:


------------------------------------------
[...truncated 295.06 KB...]
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 17, 2020 12:45:29 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 17, 2020 12:45:29 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 17, 2020 12:45:29 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 17, 2020 12:45:29 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 17, 2020 12:45:29 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 17, 2020 12:45:29 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 17, 2020 12:45:29 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 17, 2020 12:45:29 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 17, 2020 12:45:29 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 17, 2020 12:45:30 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 17, 2020 12:45:30 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 17, 2020 12:45:30 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 17, 2020 12:45:30 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 17, 2020 12:45:30 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 17, 2020 12:45:30 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 17, 2020 12:45:31 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 17, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 17, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-rqv0_DbmzBQC6DuVFiPtT9tmKlJNViqjJQYetA_1FxA.jar
    Aug 17, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-yGS9MOTXRZ8q9dmce-3t1aNKoxjY7N1GVP077MK0JX0.jar
    Aug 17, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-xJffww7-PAsmSena_J2m4bP6YWz9nTruMjvRb9V6hZI.jar
    Aug 17, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-WOFXlM6AF3_iXpaQ6-LDDxxxKtFV6engt0QvE8H4-vc.jar
    Aug 17, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-nrd8yYLfi4s1f5wf3hWFH_jZ0o6qngq5mJQpbS-YxvM.jar
    Aug 17, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-feJArYJtM2XqcLIVS0ncoRbIGkXhgpuXxJ4Te5Uip8A.jar
    Aug 17, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-iuKql9yqG2SEPj8rCSuVvPIYtORzl-eLHpypKabc-Gg.jar
    Aug 17, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-nryg_XhqZsEqDPv408qy756SD2_W8gh664dCHkWP5NQ.jar
    Aug 17, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4166936556541974499.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-IU78DxQo0iQnh1lqVQbkzPKCzaAXUIQaaauCjg8Vjog.jar
    Aug 17, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-505OB5lpIlWvKI3AunQRadS96jUGz8R3KIjV3K7m2fY.jar
    Aug 17, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-G04-yRddq1hdwhSwzWSb_1S89Sqa4pKjFIWu1s8XVjs.jar
    Aug 17, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-f516yPAhmRlvZ8Rq4McX7kg4-UqvU-e-rrzZVRw2U2c.jar
    Aug 17, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-Pm9EjwXeJu6m271RiHehPe6gj4m_H-PbKL9s9UOtjYE.jar
    Aug 17, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-mUGQk3vJuYjWaNIRXskebp56AiFZRULq1YfTKtg-ovU.jar
    Aug 17, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-1KmwRU1b3hHGIL_WoatKbULiZKWh1muoQCv4-HpqAcg.jar
    Aug 17, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-2BfACGX2082Vc3AKslrMiECWJ6ShtggIIm1lzIYzux4.jar
    Aug 17, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-OumKUTMD63z7phpz_GRDYrJPZkW0-jJkT5My1wuEWW8.jar
    Aug 17, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-4bHObEUrfmadEO-XmATD_zc4porJA4pJDjbgl1yo6tI.jar
    Aug 17, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-vKAYJKAVcTxqg_MYFXYtR1vM3SMReEE8U6VWbNxn5oQ.jar
    Aug 17, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-bVMHI83tzcLa9TRPpFQklaXbkKubhkKaBpRoNMuF1zc.jar
    Aug 17, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-_gotK0wOs82O7_0464aKzhNWaOiBBslTYlWhoQWePr0.jar
    Aug 17, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-AYhNhTGv-e29cxL60RoKLdBJOc0U5W4leion2MmwATI.jar
    Aug 17, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-RMseA6RyiDeW9BNZ1bJ7jvSi5XlOeZxbL1jsFoirWMQ.jar
    Aug 17, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-D4a6S-J93GBgIYiNQCsuGWebNvorGVEAWW5zfYtYgzk.jar
    Aug 17, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-pjYl3iDyS5nITdqreoivv2sSxt-9ekv4-V8uH6D59EE.jar
    Aug 17, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-IcQmALM6cI4LHM9Bk2xx0aMip3m7efMUE6yW8h20f7M.jar
    Aug 17, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-rqv0_DbmzBQC6DuVFiPtT9tmKlJNViqjJQYetA_1FxA.jar
    Aug 17, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-OISCsTiRaRjl4KGk_5RYsNc4Ry1Ke6APapX9Jixqs90.jar
    Aug 17, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-CRkoi97QygBGaD3Yjb-LXLUA7Y-XpQs97oCqD-tziXw.jar
    Aug 17, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-zEIFw3WIR0msgKtmwOep5o3oC1OJPSeaVk1Gjg9sTWw.jar
    Aug 17, 2020 12:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-9rjxojp2Jpn95yRDT6dYG6jFRFBnxhG6TmtX_subQ4o.jar
    Aug 17, 2020 12:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 1 seconds
    Aug 17, 2020 12:45:34 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 17, 2020 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 17, 2020 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 17, 2020 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 17, 2020 12:45:35 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 17, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93834 bytes, hash b84c52926e4d5b380fcd0ec339520b0df571821fc45a0017bbf01664baae8056> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-uExSkm5NWzgPzQ7DOVILDfVxgh_EWgAXu_AWZLqugFY.pb
    Aug 17, 2020 12:45:35 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Aug 17, 2020 12:45:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-17_05_45_35-6290719352459916703?project=apache-beam-testing
    Aug 17, 2020 12:45:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-17_05_45_35-6290719352459916703
    Aug 17, 2020 12:45:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-17_05_45_35-6290719352459916703
    Aug 17, 2020 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-17T12:45:35.531Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 17, 2020 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-17T12:45:42.670Z: Worker configuration: n1-standard-1 in us-central1-f.
    Aug 17, 2020 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-17T12:45:43.330Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 17, 2020 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-17T12:45:43.370Z: Expanding GroupByKey operations into optimizable parts.
    Aug 17, 2020 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-17T12:45:43.400Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 17, 2020 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-17T12:45:43.464Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 17, 2020 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-17T12:45:43.492Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 17, 2020 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-17T12:45:43.524Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 17, 2020 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-17T12:45:43.559Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 17, 2020 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-17T12:45:43.995Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 17, 2020 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-17T12:45:44.074Z: Starting 5 workers in us-central1-f...
    Aug 17, 2020 12:46:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-17T12:46:11.367Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 17, 2020 12:46:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-17T12:46:11.528Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
    Aug 17, 2020 12:46:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-17T12:46:11.554Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
    Aug 17, 2020 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-17T12:46:16.925Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 17, 2020 12:46:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-17T12:46:36.758Z: Workers have started successfully.
    Aug 17, 2020 12:46:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-17T12:46:36.794Z: Workers have started successfully.
    Aug 17, 2020 12:47:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-17T12:47:07.150Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 17, 2020 12:47:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-17T12:47:07.303Z: Cleaning up.
    Aug 17, 2020 12:47:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-17T12:47:07.355Z: Stopping worker pool...
    Aug 17, 2020 12:48:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-17T12:47:59.432Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 17, 2020 12:48:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-17T12:47:59.484Z: Worker pool stopped.
    Aug 17, 2020 12:48:09 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-17_05_45_35-6290719352459916703 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 24e001e5-880c-4668-86be-0d519c5aa254 and timestamp: 2020-08-17T12:48:09.061000000Z:
                     Metric:                    Value:
                   read_time                    12.013
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 17, 2020 12:48:09 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 5,5,main]) completed. Took 2 mins 49.158 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 48s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/g6dfb2fscathk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #881

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/881/display/redirect>

Changes:


------------------------------------------
[...truncated 294.35 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 17, 2020 6:45:22 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 17, 2020 6:45:22 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 17, 2020 6:45:22 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 17, 2020 6:45:22 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 17, 2020 6:45:22 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 17, 2020 6:45:22 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 17, 2020 6:45:22 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
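
    A note on the failure above: this IllegalStateException is the coder-inference error Beam raises when a PCollection of Row elements has no schema attached, and the message itself points at PCollection.setRowSchema as the remedy. Below is a minimal sketch of that kind of fix, not the test's actual code; the field names and types are assumptions taken from the query's projected columns (author, type, title, score).

    // Minimal sketch, assuming the Beam Java SDK schema API: attach a row schema to a
    // PCollection<Row> so a RowCoder can be inferred, which is what the
    // "Cannot provide a coder for a Beam Row" error above asks for.
    // Field names/types are illustrative assumptions based on the query's projection.
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.schemas.Schema.FieldType;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    class RowSchemaSketch {
      static PCollection<Row> attachSchema(PCollection<Row> rows) {
        Schema schema =
            Schema.builder()
                .addNullableField("author", FieldType.STRING)
                .addNullableField("type", FieldType.STRING)
                .addNullableField("title", FieldType.STRING)
                .addNullableField("score", FieldType.INT64)
                .build();
        // setRowSchema registers the schema (and hence a row coder) on the collection,
        // which satisfies the coder check in PCollection.finishSpecifying.
        return rows.setRowSchema(schema);
      }
    }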

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 17, 2020 6:45:22 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 17, 2020 6:45:22 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 17, 2020 6:45:22 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 17, 2020 6:45:22 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 17, 2020 6:45:22 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 17, 2020 6:45:22 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 17, 2020 6:45:22 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 17, 2020 6:45:23 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 17, 2020 6:45:24 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 17, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 17, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-a3_xMAegQwwM2zOI_gFoT2C1mg-PuRlIBqTVNyhcVHc.jar
    Aug 17, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-yTBz0I-Bng7Z_diHQehEsaZ11dIOtplqxSTSxfZH_DU.jar
    Aug 17, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-xEAi0Ow5h_haohwDrs5KyrcwOh9A7E2FAwQgJJj6nHA.jar
    Aug 17, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-mTvqzT3z91hqvQ_XWmMuBL1ElhP4f77nUbKIvuzT2UI.jar
    Aug 17, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-Sje7439KQkQHkigNOuoyOMCCF95JrdeW_yHiZf77WVI.jar
    Aug 17, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test1349402302793891010.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-pUM_iFZE-NGYk-gqns4DIgKSvvBbINIV0ZhEPkiywEE.jar
    Aug 17, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-rvjGpcdB18skN9f5hjsRU67zH89T7xk8mjCMTtlN3aY.jar
    Aug 17, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-PdlaydU8MPr1LyziUBiFJ9TJB_U_ATWgx-sqv__RRBU.jar
    Aug 17, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-mTvqzT3z91hqvQ_XWmMuBL1ElhP4f77nUbKIvuzT2UI.jar
    Aug 17, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-MGwX99ukwlcvPlnYxM4vkOGuffYcCnQQUeGSMahUAt0.jar
    Aug 17, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-8KA5BJE1WdyxG8xRoMMfUE6ivYPlwUWuENuFWl1vf7c.jar
    Aug 17, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-qcebyV0RyAhOTMBraCC-oWGbyvERBtv1HuwUGYKOKlo.jar
    Aug 17, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-PpuzdHLuwdZpTaYgvXTA5SV_nkemqY3SLwH6pAnUzTs.jar
    Aug 17, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-JfiLAYKCydF0ve9jC2jzfJ0HVo4HK8UWpY7aMJbt1Bc.jar
    Aug 17, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-74ghgleUVnyHT_5uza74nqLEtvpc-3CYlJ30SVDHdCA.jar
    Aug 17, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tGuvIClbEf7pV3Ws5oy24Be4YtpK4yyFZhKwzCvv5Ug.jar
    Aug 17, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-woxtNpO3k0dReQf57EshhYurPWlU5FcfSy4MiLpGQcw.jar
    Aug 17, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-LqdT5SoP6KJwmLRn_RMJzbpn-dyDL1tkQdMRD1trgXE.jar
    Aug 17, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-PsIO8gew5SYRgt8nFfC0kI4H9TTAdeqVoq6kYxXjStI.jar
    Aug 17, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-7XyWchJaE6ovNZ_EE32kLXwlI8af4n3Vy6CQ6Ol18kM.jar
    Aug 17, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-Ewy0GOwPeRkKdlTpmbomAOL2WqUUbZG9tpiYiT50m6Y.jar
    Aug 17, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-kA47QD5iJd6XTxsbFIhas3apPxlj_z4Qp5KWpmK_1YA.jar
    Aug 17, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-JqvmcdXdpbGVG3nIf7E7ZWgyGyA2CY6f2vDhyyJCyjI.jar
    Aug 17, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-dPwAEEYCgmIhQpqsf_-sBYNHWwKVfy2__-2GtyDFBns.jar
    Aug 17, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-rpwl0hd1KxD2kFGfBKR5GWqqxoK_IFgFgsIL-Jdydag.jar
    Aug 17, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-r-GJ_qRKWvf1MTZDAL39pBT-l6WGqnBuRmFSPEuFW-w.jar
    Aug 17, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-LK42Pm_TOEqoPU5uzNK7c62-T3PaGG0xzSThLSoyXjQ.jar
    Aug 17, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-7SpFGSv9flbUE4ZlJ2V9AOzebp_h7uI4zHGu6Sgnc2o.jar
    Aug 17, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-CAN1XVKfV5C2lKwkoSBdkfG45aS4OyrjRXsSKCyrWA8.jar
    Aug 17, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-6HDn8wh2ZCM7J8C3l4bs2F8-9g8GxaG4aFxdSp4VuGU.jar
    Aug 17, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-roEmPhxkZPeAhz0Prcccjf2wVCS_PzcWbv3JK3sP_2s.jar
    Aug 17, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 1 seconds
    Aug 17, 2020 6:45:27 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 17, 2020 6:45:27 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 17, 2020 6:45:27 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 17, 2020 6:45:27 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 17, 2020 6:45:27 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 17, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93834 bytes, hash 4ce96f1d4d6092e103a63aa670b9622604c0b496e3ea94889c9337e0e8b2104b> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-TOlvHU1gkuEDpjqmcLliJgTAtJbj6pSInJM34OiyEEs.pb
    Aug 17, 2020 6:45:27 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Aug 17, 2020 6:45:28 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-16_23_45_27-14040165029325410178?project=apache-beam-testing
    Aug 17, 2020 6:45:28 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-16_23_45_27-14040165029325410178
    Aug 17, 2020 6:45:28 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-16_23_45_27-14040165029325410178
    Aug 17, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-17T06:45:27.791Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 17, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-17T06:45:34.916Z: Worker configuration: n1-standard-1 in us-central1-f.
    Aug 17, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-17T06:45:35.501Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 17, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-17T06:45:35.543Z: Expanding GroupByKey operations into optimizable parts.
    Aug 17, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-17T06:45:35.569Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 17, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-17T06:45:35.644Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 17, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-17T06:45:35.662Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 17, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-17T06:45:35.694Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 17, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-17T06:45:35.723Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 17, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-17T06:45:36.129Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 17, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-17T06:45:36.221Z: Starting 5 workers in us-central1-f...
    Aug 17, 2020 6:45:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-17T06:45:54.860Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 17, 2020 6:46:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-17T06:46:02.191Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 17, 2020 6:46:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-17T06:46:25.364Z: Workers have started successfully.
    Aug 17, 2020 6:46:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-17T06:46:25.396Z: Workers have started successfully.
    Aug 17, 2020 6:46:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-17T06:46:59.072Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 17, 2020 6:46:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-17T06:46:59.225Z: Cleaning up.
    Aug 17, 2020 6:46:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-17T06:46:59.299Z: Stopping worker pool...
    Aug 17, 2020 6:47:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-17T06:47:50.807Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 17, 2020 6:47:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-17T06:47:50.896Z: Worker pool stopped.
    Aug 17, 2020 6:47:58 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-16_23_45_27-14040165029325410178 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): c81ff5f2-1fb6-48e7-a0ef-f6895c1bb03d and timestamp: 2020-08-17T06:47:58.738000000Z:
                     Metric:                    Value:
                   read_time                     14.37
                 fields_read                 4375276.0
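
    A note on the passing run above: with DIRECT_READ, both the projection (usedFields=[by, type, title, score]) and the filter are pushed into the BigQuery Storage API read, which is why only those four fields are scanned. The sketch below is an illustrative, hand-written equivalent of that pushed-down read using BigQueryIO directly; it assumes BigQueryIO's TypedRead API (withMethod, withSelectedFields, withRowRestriction), and the table reference is a placeholder, not the SQL test's actual code.

    // Illustrative sketch only: read the same columns with the same row restriction
    // via BigQueryIO's Storage API (DIRECT_READ) path.
    // "project:dataset.hacker_news" is a placeholder table reference.
    import java.util.Arrays;
    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.values.PCollection;

    class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
        PCollection<TableRow> rows =
            p.apply(
                BigQueryIO.readTableRows()
                    .from("project:dataset.hacker_news") // placeholder
                    .withMethod(Method.DIRECT_READ)
                    // Column projection, mirroring usedFields in the BEAMPlan above.
                    .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                    // Predicate pushed to the read session, mirroring the pushed filter.
                    .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));
        p.run().waitUntilFinish();
      }
    }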

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 17, 2020 6:47:59 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.019 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker,5,main]) completed. Took 2 mins 46.334 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 39s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/jn5xb4qecn7zi

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #880

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/880/display/redirect>

Changes:


------------------------------------------
[...truncated 295.62 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 17, 2020 12:45:21 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 17, 2020 12:45:21 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 17, 2020 12:45:21 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 17, 2020 12:45:21 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 17, 2020 12:45:21 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 17, 2020 12:45:21 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 17, 2020 12:45:21 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 17, 2020 12:45:21 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 17, 2020 12:45:21 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 17, 2020 12:45:21 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 17, 2020 12:45:21 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 17, 2020 12:45:21 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 17, 2020 12:45:21 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 17, 2020 12:45:21 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 17, 2020 12:45:21 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 17, 2020 12:45:22 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 17, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 17, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-lBx4vrkUueX6q6SB2xInan0R1XTt-Os7idubyAXt8h8.jar
    Aug 17, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-ml_-0V5zhB-UpM8128bh1EOkWw-SP2ZXKmKt48VAfBU.jar
    Aug 17, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-uwt4nqwVuOQNjJM-FxWgemBoqwLTi-TYCarl9F9p_HY.jar
    Aug 17, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-kZP3HJfgaYsYWekGOdsVX9DFIIh2kpgsVQ63Ju8vvvs.jar
    Aug 17, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-ePfrV6N-tCMFHPrCEZjUXwqJUyp1eb09sxgGaH6jKZ4.jar
    Aug 17, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-xlFwz9C4heT_kaPU-E_km8Hwlhpwyl9yoKKdNjgZ2CU.jar
    Aug 17, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-Q8IUKc0aZ6gSrv1YYJJfdZ3Fku3366Uc7ZM8hPKz01I.jar
    Aug 17, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-ScQrMmm9nuvLyyuhOjWIN-ve-Ap_BoFaVjD_Ttka1NA.jar
    Aug 17, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-x8wOrpsOiZK-anfOqKj-uZs0wTuakfmDBn1dmd9XVKA.jar
    Aug 17, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-h0YNYb8T9e4JMqa0ymxkdfcdwIC5o7zQpcsPp_v7H4c.jar
    Aug 17, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-lBx4vrkUueX6q6SB2xInan0R1XTt-Os7idubyAXt8h8.jar
    Aug 17, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-WTADAeTfniVwQKJAvqI5rLx-WbPzb-VhH5puaU-NZIM.jar
    Aug 17, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-Kr9ZPe_SN2b04QEzI_efBOaNO59eyinqIEphnP1DKbk.jar
    Aug 17, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-VTdZAPxCYvCOuGcPUgQtYnLhAvQigf1YWc0nNih1k1E.jar
    Aug 17, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-F4dnTHh_orLEB8Fj7bwxw1GJu0kLi_tSXHs3db_olAw.jar
    Aug 17, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-mtHo4YLP-FA1n9pBrMaYxNDOmoba00AIFWwi1OTF3WI.jar
    Aug 17, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-YSzuloLv9twhkmifL-NxmbCIx2PU0-MbM9f5SDalGU0.jar
    Aug 17, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-medAN66FPwPpV7YPXI-kfD7K086123tWK_0gs2-WEe8.jar
    Aug 17, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-kVSgGcU9NSxEW21cyyvFRIRsYVzn1k6zGv9xn8UwZrM.jar
    Aug 17, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-iiaMTqvQW8YhsJNASP1D2RqbXQKes954eD-4F4Y9FC0.jar
    Aug 17, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8416227806955573217.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-vm5C9V8HNsP_bv7J8AuUGXzWjwhDHel4F8NkaSdQ-OI.jar
    Aug 17, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-tPEI3fKyaM3QFE_0iQg_0txx-NcpEBtJVaLLvC1imY4.jar
    Aug 17, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-6hdnbMHH5Lks2mKtdVZN-O3V16rsRxt7WmCGXagiVyc.jar
    Aug 17, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-ePRufS6RlQlMAl6KN068BBz-uuLcjggMu14W0NXORkg.jar
    Aug 17, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-qSMvjX09N5W7ATSrAdrtwZi3naYKK4PE_e4HPs34ELo.jar
    Aug 17, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-SbMbuEXpuw_AnvPn4I2q9Lnq6BsonH7qL76_siDUyI8.jar
    Aug 17, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-9z-yN1Ydcujdp3b4MvZEg_yd_4MMHuAK2CcxLD3gCT0.jar
    Aug 17, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-in9fzc-sQVy_uytN5kYkxmQw5jPfQfjzk8Spu-lDEPI.jar
    Aug 17, 2020 12:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-2_z3DeBmDrcla00g5n9P2d1A0uowLWeRT88pyjea-1s.jar
    Aug 17, 2020 12:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-w1jYaoc1XvvWQ_MQxCcCEo9fw2PvemqJnCA9Vsjr4OQ.jar
    Aug 17, 2020 12:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-4MqGVJtvo2rk6BTVoJwkve7oIr8khVpqMaS7HLZ4-q0.jar
    Aug 17, 2020 12:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 1 seconds
    Aug 17, 2020 12:45:25 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 17, 2020 12:45:25 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 17, 2020 12:45:25 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 17, 2020 12:45:25 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 17, 2020 12:45:25 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 17, 2020 12:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93835 bytes, hash db6de65c6f7424dea55abd1eb7454aefb0e8bc95dd430c6659972d328a190e21> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-223mXG90JN6lWr0et0VK77DovJXdQwxmWZctMooZDiE.pb
    Aug 17, 2020 12:45:26 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Aug 17, 2020 12:45:27 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-16_17_45_26-753594221903651051?project=apache-beam-testing
    Aug 17, 2020 12:45:27 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-16_17_45_26-753594221903651051
    Aug 17, 2020 12:45:27 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-16_17_45_26-753594221903651051
    Aug 17, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-17T00:45:26.153Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 17, 2020 12:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-17T00:45:33.997Z: Worker configuration: n1-standard-1 in us-central1-f.
    Aug 17, 2020 12:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-17T00:45:34.790Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 17, 2020 12:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-17T00:45:34.834Z: Expanding GroupByKey operations into optimizable parts.
    Aug 17, 2020 12:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-17T00:45:34.875Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 17, 2020 12:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-17T00:45:34.951Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 17, 2020 12:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-17T00:45:34.979Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 17, 2020 12:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-17T00:45:35.014Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 17, 2020 12:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-17T00:45:35.052Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 17, 2020 12:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-17T00:45:35.489Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 17, 2020 12:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-17T00:45:35.577Z: Starting 5 workers in us-central1-f...
    Aug 17, 2020 12:45:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-17T00:45:55.096Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 17, 2020 12:46:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-17T00:46:04.046Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 17, 2020 12:46:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-17T00:46:23.645Z: Workers have started successfully.
    Aug 17, 2020 12:46:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-17T00:46:23.679Z: Workers have started successfully.
    Aug 17, 2020 12:46:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-17T00:46:56.512Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 17, 2020 12:46:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-17T00:46:56.655Z: Cleaning up.
    Aug 17, 2020 12:46:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-17T00:46:56.732Z: Stopping worker pool...
    Aug 17, 2020 12:47:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-17T00:47:47.676Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 17, 2020 12:47:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-17T00:47:47.721Z: Worker pool stopped.
    Aug 17, 2020 12:47:56 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-16_17_45_26-753594221903651051 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 6745aa36-dd6b-4cc5-bb2e-ea34e2d59dc0 and timestamp: 2020-08-17T00:47:56.392000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    13.334

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 17, 2020 12:47:56 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.018 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 7,5,main]) completed. Took 2 mins 43.65 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 32s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/g7vq7qvi3fnt6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #879

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/879/display/redirect>

Changes:


------------------------------------------
[...truncated 293.12 KB...]
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 16, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 16, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 16, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 16, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 16, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 16, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 16, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
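
Note: this IllegalStateException is Beam's generic complaint that a PCollection of Row elements reached pipeline construction without a schema or coder attached, so the two remedies named in the message (PCollection.setRowSchema or .setCoder) are the relevant fix. The sketch below is a minimal, hypothetical illustration of attaching a row schema after a pass-through ParDo; the schema, field names, and class name are placeholders and are not taken from BigQueryIOPushDownIT.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaFixSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        // Illustrative schema roughly matching the projected columns in the query above.
        Schema schema = Schema.builder()
            .addNullableField("author", Schema.FieldType.STRING)
            .addNullableField("type", Schema.FieldType.STRING)
            .addNullableField("title", Schema.FieldType.STRING)
            .addNullableField("score", Schema.FieldType.INT64)
            .build();

        Row sample = Row.withSchema(schema)
            .addValues("alice", "story", "hello", 3L)
            .build();

        PCollection<Row> rows =
            p.apply(Create.of(sample).withRowSchema(schema))
                // Pass-through ParDo, standing in for the RowMonitor step in the log.
                .apply("Monitor", ParDo.of(new DoFn<Row, Row>() {
                  @ProcessElement
                  public void process(@Element Row row, OutputReceiver<Row> out) {
                    out.output(row);
                  }
                }))
                // Without a schema or coder on this output, coder inference fails with
                // the IllegalStateException shown above.
                .setRowSchema(schema);
                // Alternatively: .setCoder(RowCoder.of(schema))

        p.run().waitUntilFinish();
      }
    }

Either call is sufficient for coder inference to succeed on the ParDo output.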

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 16, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 16, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 16, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 16, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 16, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 16, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 16, 2020 6:45:21 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 16, 2020 6:45:21 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
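
Note: the BeamPushDownIOSourceRel above shows the planner pushing both the column projection (usedFields=[by, type, title, score]) and the filter down into the BigQuery Storage read. For illustration only, roughly the same read could be expressed directly against BigQueryIO rather than through Beam SQL, as in the hypothetical sketch below; the project, dataset, and table names are placeholders, not the ones used by this test.

    import java.util.Arrays;

    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
    import org.apache.beam.sdk.values.PCollection;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        PCollection<TableRow> rows =
            p.apply(
                BigQueryIO.readTableRows()
                    .from("my-project:my_dataset.hacker_news")  // placeholder table
                    .withMethod(Method.DIRECT_READ)             // BigQuery Storage Read API
                    // Column projection, mirroring usedFields=[by, type, title, score].
                    .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                    // Filter push-down, mirroring the predicate logged above.
                    .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }

With DIRECT_READ, both the projection and the row restriction are applied by the BigQuery Storage service before rows reach the workers.
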
    Aug 16, 2020 6:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 16, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 16, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-90kgqwM6YLqjjx2OMuY60IaZqgzfWSdj4QahfxUvqyQ.jar
    Aug 16, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-XlUZNp0Sq1C0mU1rwhqqPsYaX4pHRSqUTJmWi4NtnH8.jar
    Aug 16, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-R8ymtdDvpP7V_X0Glj1iXY6S9wWkuQBttWcmO_IW_EQ.jar
    Aug 16, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-a-Q8eVq-RYexrnYzp6YaiNh8TX1tFLvLovrlC9y4s80.jar
    Aug 16, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-90kgqwM6YLqjjx2OMuY60IaZqgzfWSdj4QahfxUvqyQ.jar
    Aug 16, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-0WmQ2sHaLMVLY0oDVkLGaQtJwqtItho_1dA8570Uoyo.jar
    Aug 16, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-JTw4YqCdYckit1utub-F2WW7fPYk6fV-F1SQpzfcUuU.jar
    Aug 16, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-DcLvpTjtxqPs-fg3ltoq5EZtYjjcJtQOBAKd1xqdQFA.jar
    Aug 16, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7936757727182598064.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-ZoUFZHSoLCwsfNLV0jo-9sYATQsbcXWfTLRWvT49zrE.jar
    Aug 16, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-SUv5dnWsN3mBe3ukKnGdlaCXaZO4d6SjKGEyEZphaK8.jar
    Aug 16, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-6CKCtDx4KKmnB8rS8LsWWOOXF7UbMT0NRgidWxgixtM.jar
    Aug 16, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-qQKtk8tb3iozyNCsJW7AgnYflR1PW08Ak5fv_5XZjIQ.jar
    Aug 16, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-M2YSr7yX_MmcEUn5Q1kQnuCangnXmI5qXV8Njv671PA.jar
    Aug 16, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-wfmP_BJP_2xaDpDOSAALy5gxjYulA-LmO99U6FslXs8.jar
    Aug 16, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-pOae3xiyYRiR7d8yBeo5Nt05yXh51hTttk6bqpAoUwA.jar
    Aug 16, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-QwkaOYSZbt6aaX637I1uS3iNiN7xe2DXRExoG1CKxb4.jar
    Aug 16, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-dm9TiwlRKHRYnHLN585LnQaMu9bySh4LwY0l32rIVN8.jar
    Aug 16, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-tDzPdXRmxulyJHTWfuuVBV_Qvc2gfclcxGqYctXTgGY.jar
    Aug 16, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-Fjdvoyy9PW9F3fuBc48KTq2c9bWmFwYxEuT6lHvs9uA.jar
    Aug 16, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-Jc3XNjrlL9UI1x5fqBnfSxxtFuE9WGlNMwhBipzO-ng.jar
    Aug 16, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-nWtg8eKOmoQJwK9Mpz1rw-ei0e-k5KXo13ojq_U8agQ.jar
    Aug 16, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-B6f81N3T1B-mAyth0B5MWRPuYcxdG7jDZmOZgyw9Ywo.jar
    Aug 16, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-cfmKzneRp2KnnKli0-5oXWYtYi3MjgMe_QwWO2FtmpI.jar
    Aug 16, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-3jALVuEMhksqypfm04o320O6DSXzpwF4lL3R0-9SYuY.jar
    Aug 16, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-KroKdJdYOfketBunNrP0n395ETZUcPLGVIxRWvsEBXA.jar
    Aug 16, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-K73QdLmKsjqxWIBXtW44KgVQk4Hw7M761f2Tf7-yrvw.jar
    Aug 16, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-ym-lH2vnyxQ8Flmh8HU5Qi69nC1fAol5WmSMZBpV5QI.jar
    Aug 16, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-gOaQ2hJr9hjhrmLv1MtWxAHj5B9xnIWHqNNfhXeUOEE.jar
    Aug 16, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-4lvDFFhA8H5UE3AIIa98hngC_bbbcoz9Ba5sSs-tQ50.jar
    Aug 16, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-8uTRoKNO3-_4EDNCg1YEhy-A48Wchp0DdvgP2VEvcTQ.jar
    Aug 16, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-7725_XEcjS1XIM72s4KGOdDoQyEgEkKqTDbGxrwVpoY.jar
    Aug 16, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/com.alibaba/fastjson/1.2.68/9e3d29f05bcfab1c15a1357ebf2dd513c1d42f49/fastjson-1.2.68.jar to gs://temp-storage-for-perf-tests/loadtests/staging/fastjson-1.2.68-cGrbCezeeBQfDPJGWh6b307ug_n5g8_BYqWhckhy_rs.jar
    Aug 16, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 184 files cached, 31 files newly uploaded in 1 seconds
    Aug 16, 2020 6:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 16, 2020 6:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 16, 2020 6:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 16, 2020 6:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 16, 2020 6:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 16, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93842 bytes, hash ca9019dd0092c099a7a116e685eb43b04083add33ae0d78a16dd1da388f51eed> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-ypAZ3QCSwJmnoRbmhetDsECDrdM64NeKFt0do4j1Hu0.pb
    Aug 16, 2020 6:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Aug 16, 2020 6:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-16_11_45_25-7516485680008866094?project=apache-beam-testing
    Aug 16, 2020 6:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-16_11_45_25-7516485680008866094
    Aug 16, 2020 6:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-16_11_45_25-7516485680008866094
    Aug 16, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-16T18:45:25.779Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 16, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-16T18:45:33.039Z: Worker configuration: n1-standard-1 in us-central1-a.
    Aug 16, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-16T18:45:33.726Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 16, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-16T18:45:33.773Z: Expanding GroupByKey operations into optimizable parts.
    Aug 16, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-16T18:45:33.810Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 16, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-16T18:45:33.888Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 16, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-16T18:45:33.923Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 16, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-16T18:45:33.953Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 16, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-16T18:45:33.991Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 16, 2020 6:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-16T18:45:34.389Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 16, 2020 6:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-16T18:45:34.462Z: Starting 5 workers in us-central1-a...
    Aug 16, 2020 6:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-16T18:45:48.781Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 16, 2020 6:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-16T18:45:58.073Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 16, 2020 6:46:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-16T18:46:20.091Z: Workers have started successfully.
    Aug 16, 2020 6:46:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-16T18:46:20.129Z: Workers have started successfully.
    Aug 16, 2020 6:46:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-16T18:46:49.346Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 16, 2020 6:46:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-16T18:46:49.559Z: Cleaning up.
    Aug 16, 2020 6:46:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-16T18:46:49.637Z: Stopping worker pool...
    Aug 16, 2020 6:47:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-16T18:47:44.510Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 16, 2020 6:47:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-16T18:47:44.556Z: Worker pool stopped.
    Aug 16, 2020 6:47:53 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-16_11_45_25-7516485680008866094 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): ee6fe23b-3619-4f41-8e12-89fc1072aafa and timestamp: 2020-08-16T18:47:54.044000000Z:
                     Metric:                    Value:
                   read_time                    11.081
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 16, 2020 6:47:54 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.019 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 2,5,main]) completed. Took 2 mins 42.942 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 34s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/gz5vhdxq54lry

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #878

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/878/display/redirect>

Changes:


------------------------------------------
[...truncated 294.39 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 16, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 16, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 16, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 16, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 16, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 16, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 16, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 16, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 16, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 16, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 16, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 16, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 16, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 16, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 16, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 16, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-pqivb6z9vJGMwFePP4gFYil9kV5wUHD0T17wCt2MFlg.jar
    Aug 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-7Jawcdzcc4jZjqCqXUjVNO08ND1Wj1lPNxJB9mdbe7Q.jar
    Aug 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-mY_w1nL4pEk14FLJrJJrvI-pwzzFNS_hC94nMUez_bY.jar
    Aug 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-X-GX8-3SuslMwzJWZBHIfYcLULW0ELvqjo_M1J5S-Cc.jar
    Aug 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-7qDC2B079PpR6RSXDGn5QWk8ntCAITt7Ot6GO40RBOo.jar
    Aug 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-LqQpL9Mjd9EQc8lkToUIzBJnFFyTsui-CQ8oRzvNS-g.jar
    Aug 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-L5NZ-oJCvtP9_-YGPVU6fCsWONrtq_x9wS3Fy1lg2GA.jar
    Aug 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-LL45mLdi8z1IXEDDbQGgPtZEqI3bjMkthMHguAHEk5s.jar
    Aug 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-7TshBbUp_BQA6r4gxlTR7DILzTEsQFEpnSNpp9kzSxc.jar
    Aug 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-W0ELW7Z053qKAyEx5oZZvrlqWc1UnYuorJkFvdrABek.jar
    Aug 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-eBRt5hoppPMh_hw2Nb7kGEd4km-y8Nr4Drq9dh9PkkE.jar
    Aug 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-SgxoJ6F5hA2CdU6aWbIfJZ8bW3YafETqnJvcPMW4ORo.jar
    Aug 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-2UZfuxUI8Zy251RkLW8G29yDXuwj-6SqM1l_kEIQwpA.jar
    Aug 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5886717545648360395.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-oocNTQiAdo19ZikkTXAeNYzSmGwBCmqx5ZZJHqV83Qw.jar
    Aug 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-IDWDcebHpyj_8WsXEiggmHmhvWEafZVXyMtwRKymF0w.jar
    Aug 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-r1B1UfFcMOuPzqfSlV6cUJdufwbj2FNYGhjx6r3VUQE.jar
    Aug 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-hq-BjLdTaEIHLnWMHQW1yTz6pL-oxs57NlijacZuR_c.jar
    Aug 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-LGGjsU-vzdVu3xGhU8nwr-Cdzxy94iGC4miHry_EcDk.jar
    Aug 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-pqivb6z9vJGMwFePP4gFYil9kV5wUHD0T17wCt2MFlg.jar
    Aug 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-RXOlGEH3MDOgYiOOPME6464bxVxm91Y4iUF12swU5HY.jar
    Aug 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-sVJ_o7mgo0En-ZyLPGPVQBns8aAMa2kFklh1gQSy9N0.jar
    Aug 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-xrfzkvhenoI39esSISJ7ZCwt_wcNwDL9YxSuPUIj2FE.jar
    Aug 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-gATjriF9phrQntHrEKxANmBSsvZ5OvIeU_YGPQBw_TM.jar
    Aug 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-5EPDqleup88-ZUh_DyesKzPGShyyld7sDas0s-hlTD0.jar
    Aug 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-RX2z5-P7pHd2cz5zO33DgfkBwXo6C4AXwJCQWrcX5Os.jar
    Aug 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-09TZkxAUhArR9dkb1IsD7RanEMbsb6cpsHWBf4wiWKo.jar
    Aug 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-MDW7vWHM5vcU2P8gqmlhMPlOJz8UrAK_XNjR-nVEci8.jar
    Aug 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-zkaMd7_JQosdZ8aLgliJMlI3sVctd2IJuLmaVf9VrDM.jar
    Aug 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-L2Gerh6DcF7tDrtTkP6YABHbdr3rO6oNXBGhBGKFVIk.jar
    Aug 16, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-QywM0859IiGstyCe8usQAw0ifpWjQeFfuXEESBUuKD4.jar
    Aug 16, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-1JzFpbYoAi5aPKajK7xjqusjkKy0r0Hv_F5qgYVrNno.jar
    Aug 16, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 1 seconds
    Aug 16, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 16, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 16, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 16, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 16, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 16, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93835 bytes, hash 7f00dbb4b186b647b66bd1598a17f72458ae674642ae1f1e3f918a07ab2b1601> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-fwDbtLGGtke2a9FZihf3JFiuZ0ZCrh8eP5GKB6srFgE.pb
    Aug 16, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Aug 16, 2020 12:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-16_05_45_22-14901385461881618509?project=apache-beam-testing
    Aug 16, 2020 12:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-16_05_45_22-14901385461881618509
    Aug 16, 2020 12:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-16_05_45_22-14901385461881618509
    Aug 16, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-16T12:45:22.219Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 16, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-16T12:45:29.649Z: Worker configuration: n1-standard-1 in us-central1-f.
    Aug 16, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-16T12:45:30.301Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 16, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-16T12:45:30.370Z: Expanding GroupByKey operations into optimizable parts.
    Aug 16, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-16T12:45:30.401Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 16, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-16T12:45:30.474Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 16, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-16T12:45:30.503Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 16, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-16T12:45:30.537Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 16, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-16T12:45:30.569Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 16, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-16T12:45:30.911Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 16, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-16T12:45:30.988Z: Starting 5 workers in us-central1-f...
    Aug 16, 2020 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-16T12:45:53.622Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 16, 2020 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-16T12:46:04.001Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 16, 2020 12:46:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-16T12:46:18.683Z: Workers have started successfully.
    Aug 16, 2020 12:46:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-16T12:46:18.717Z: Workers have started successfully.
    Aug 16, 2020 12:46:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-16T12:46:48.209Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 16, 2020 12:46:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-16T12:46:48.351Z: Cleaning up.
    Aug 16, 2020 12:46:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-16T12:46:48.436Z: Stopping worker pool...
    Aug 16, 2020 12:47:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-16T12:47:30.757Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 16, 2020 12:47:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-16T12:47:30.799Z: Worker pool stopped.
    Aug 16, 2020 12:47:39 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-16_05_45_22-14901385461881618509 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 305bdd7c-5c96-4d2f-8e9f-f28ab173c53a and timestamp: 2020-08-16T12:47:39.704000000Z:
                     Metric:                    Value:
                   read_time                    11.384
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 16, 2020 12:47:40 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.019 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 2 mins 31.649 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 20s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/7xulamtfv4ovq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #877

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/877/display/redirect?page=changes>

Changes:

[Boyuan Zhang] Remove redundant setMaxNumRecords and consumerFactoryFn.

[noreply] Merge pull request #12575: [BEAM-10707] Adding Python docs precommit as


------------------------------------------
[...truncated 296.41 KB...]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 16, 2020 6:45:27 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 16, 2020 6:45:27 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 16, 2020 6:45:27 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 16, 2020 6:45:27 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 16, 2020 6:45:27 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 16, 2020 6:45:27 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 16, 2020 6:45:27 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
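
The coder failure above also names its own fix: the PCollection of Beam Rows emitted by ParDo(RowMonitor) needs an explicit schema (or coder) before the pipeline is finalized. A minimal sketch of that pattern follows; the field names and types are assumptions taken from the projected columns of the test query, and RowSchemaFix/rows are hypothetical names, since the test's actual code is not shown in this log.

    // Sketch only: attach a schema to a PCollection<Row> so a coder can be inferred.
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    class RowSchemaFix {
      static PCollection<Row> withSchema(PCollection<Row> rows) {
        Schema schema =
            Schema.builder()
                .addStringField("author")  // assumed: the projected columns of the query above
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")    // assumed numeric type
                .build();
        // Equivalent alternative: rows.setCoder(RowCoder.of(schema))
        return rows.setRowSchema(schema);
      }
    }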

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 16, 2020 6:45:28 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 16, 2020 6:45:28 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 16, 2020 6:45:28 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 16, 2020 6:45:28 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 16, 2020 6:45:28 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 16, 2020 6:45:28 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 16, 2020 6:45:28 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 16, 2020 6:45:28 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
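
In plain BigQueryIO terms, the push-down logged above amounts to a DIRECT_READ (Storage Read API) scan that requests only the used fields and applies the filter as a row restriction, so only the four projected columns and matching rows are read from BigQuery. A minimal sketch under stated assumptions follows; the table spec is hypothetical, and the real test builds this read through the SQL table provider rather than by calling BigQueryIO directly.

    // Sketch only: the BigQueryIO equivalent of the pushed-down projection and filter.
    import com.google.api.services.bigquery.model.TableRow;
    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.values.PCollection;

    class PushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();
        PCollection<TableRow> rows =
            p.apply(
                BigQueryIO.readTableRows()
                    .from("apache-beam-testing:beam.HACKER_NEWS")  // hypothetical table spec
                    .withMethod(BigQueryIO.TypedRead.Method.DIRECT_READ)
                    .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                    .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));
        // p.run() omitted: executing against BigQuery needs project and temp location options.
      }
    }
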
    Aug 16, 2020 6:45:29 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 16, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 16, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-zgHZOIQBAZCBdRlE-v7oV3JYDMOxcwHWEPxjZ5YleM8.jar
    Aug 16, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-u1SL4Ig2lak6hhKslqicee8S2x8nxqd1QXdzkr0aMFI.jar
    Aug 16, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-A0rLS-9eUC-2O6WvW7UlL0ytjXdw0571QCFTAbhFjEw.jar
    Aug 16, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-1flseenkUkqukHvWTLlc5f6J9GB4S9WFJK1KRebgYoI.jar
    Aug 16, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-QFZR0ccuoK6sbQ5PyjLBgfnIR9v2qM8w00fDi0zdYxw.jar
    Aug 16, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-HMVkabUXRgFkm6qcJv--GCQ3V01weax7PITKd1em1Xc.jar
    Aug 16, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-ueT6AzLkV-Re1-AgmDmiegIumNbpK-1ibFsdXYQlLGY.jar
    Aug 16, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-qp40OfdmROAdbeW4kpE6xIXejoPR6SWa_3ISBRxT53E.jar
    Aug 16, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-zst6zO0nAUMRe9TuSoRNbXd7ws-QK4oi4cDEpT8_kTY.jar
    Aug 16, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-R5SravEzWuL27jkVmwg5KhyWSI4lyrbfPDK9tw8nmAA.jar
    Aug 16, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-zgHZOIQBAZCBdRlE-v7oV3JYDMOxcwHWEPxjZ5YleM8.jar
    Aug 16, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-lTeoUyi8MNYcnYFJuWrbm7IpB7oPsdmVIa5k1viZuPs.jar
    Aug 16, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-VKj3qH1j7L_sLUsbvRND_x_BfCliFGIRAU1bvbPMbrM.jar
    Aug 16, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-zcMKDqq1gGLgZSvIhzNPUuvbJIVSjFFphNeaTjkeaRk.jar
    Aug 16, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-BGvjpHnM_otYBYU42J0dk5mxzTpxAXDh6Q-iJmXloTc.jar
    Aug 16, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-p_4UHjrEX0vsr6QHwKpiL0d3dMkUloW1Y9y5llMECTo.jar
    Aug 16, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-g1JkzjtZWmDdpROvK2KwG5plyanZeWP_4h7rZ1dZOBg.jar
    Aug 16, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test9086190872665844320.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-rBX9oRrJPHx0F7ELAj2tkgsgfdtJgNy8Rw5DsaiFKsY.jar
    Aug 16, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-3NcTRIGUSLVo2a0nVF_LI6_Q-a3Pbj6yx6QjCZbbbZY.jar
    Aug 16, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-y5VKywG2pTZJ0rvtgnOaD8R5HLJaRVQ_3BYpWRPiEiU.jar
    Aug 16, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-x0GkssCWqlKWicJ2GRpWmzjwb9xiPpHJTvDlMKHe1HE.jar
    Aug 16, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-YmSXKFqfi2XLwSH96HaCZ0IfK5qurZyzwgoqtuUOgcY.jar
    Aug 16, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-1NzQv3ycgL9YxEk7sFgmTaTirBoDTGL_MQlq1bEVBvQ.jar
    Aug 16, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-yKFJQQ2v91NUzVib0cQK-hxnRtyBXmm0Kq0CAWHz2oA.jar
    Aug 16, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-8OEGWaTXOv0un2OS7AfS0wW_bJHkh2055fVnYxDDWXg.jar
    Aug 16, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-gkoUq_JTSUJz8_yk5Ejr7aiYNYfkarYERHqbOR0j98U.jar
    Aug 16, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-hSnay5zePjQq18Qkz3R0x6uMLwLhCUcXQUOoiBSowWo.jar
    Aug 16, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-rROvQCXRaCyru46O_WlA1T-K25HTvSCtjwXevF6inCg.jar
    Aug 16, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-xnQJ5cSd_0i41NBu6c-EyW_pqjnwz0bXtTQLmqZI4Yw.jar
    Aug 16, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-JjJ_zRUVD5pc-1UZ3qX4EsqOyLqUH26_opPNF7TFBE4.jar
    Aug 16, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-3z5oW6idpltVYPxsG8U32WSD0kotSEeWgt0OAKllJwU.jar
    Aug 16, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 0 seconds
    Aug 16, 2020 6:45:32 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 16, 2020 6:45:32 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 16, 2020 6:45:32 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 16, 2020 6:45:32 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 16, 2020 6:45:32 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 16, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93834 bytes, hash a19ac8625a3bae000643467ff49c44d56488eb112dba74bd6939b76850902937> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-oZrIYlo7rgAGQ0Z_9JxE1WSI6xEtunS9aTm3aFCQKTc.pb
    Aug 16, 2020 6:45:32 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Aug 16, 2020 6:45:42 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-15_23_45_32-16770282450050100501?project=apache-beam-testing
    Aug 16, 2020 6:45:42 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-15_23_45_32-16770282450050100501
    Aug 16, 2020 6:45:42 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-15_23_45_32-16770282450050100501
    Aug 16, 2020 6:45:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-16T06:45:32.602Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 16, 2020 6:45:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-16T06:45:48.955Z: Worker configuration: n1-standard-1 in us-central1-f.
    Aug 16, 2020 6:45:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-16T06:45:49.776Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 16, 2020 6:45:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-16T06:45:49.849Z: Expanding GroupByKey operations into optimizable parts.
    Aug 16, 2020 6:45:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-16T06:45:49.881Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 16, 2020 6:45:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-16T06:45:49.951Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 16, 2020 6:45:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-16T06:45:49.975Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 16, 2020 6:45:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-16T06:45:50.009Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 16, 2020 6:45:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-16T06:45:50.042Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 16, 2020 6:45:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-16T06:45:50.342Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 16, 2020 6:45:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-16T06:45:50.437Z: Starting 5 workers in us-central1-f...
    Aug 16, 2020 6:46:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-16T06:46:14.165Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Aug 16, 2020 6:46:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-16T06:46:14.201Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Aug 16, 2020 6:46:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-16T06:46:15.845Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 16, 2020 6:46:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-16T06:46:38.725Z: Workers have started successfully.
    Aug 16, 2020 6:46:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-16T06:46:38.753Z: Workers have started successfully.
    Aug 16, 2020 6:46:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-16T06:46:46.115Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 16, 2020 6:47:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-16T06:47:15.174Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 16, 2020 6:47:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-16T06:47:15.313Z: Cleaning up.
    Aug 16, 2020 6:47:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-16T06:47:15.400Z: Stopping worker pool...
    Aug 16, 2020 6:48:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-16T06:48:04.693Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 16, 2020 6:48:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-16T06:48:04.731Z: Worker pool stopped.
    Aug 16, 2020 6:48:12 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-15_23_45_32-16770282450050100501 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 8e981c6c-8bd5-48ed-af6b-4d0a10192631 and timestamp: 2020-08-16T06:48:12.583000000Z:
                     Metric:                    Value:
                   read_time                    14.849
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 16, 2020 6:48:13 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.017 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 2,5,main]) completed. Took 2 mins 53.246 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 53s
106 actionable tasks: 73 executed, 33 from cache

Publishing build scan...
https://gradle.com/s/zdx6gkvhb4awa

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #876

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/876/display/redirect?page=changes>

Changes:

[Luke Cwik] [BEAM-10670] Make Read execute as a splittable DoFn by default for the


------------------------------------------
[...truncated 300.05 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 16, 2020 12:45:42 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 16, 2020 12:45:42 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 16, 2020 12:45:42 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 16, 2020 12:45:42 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 16, 2020 12:45:42 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 16, 2020 12:45:42 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 16, 2020 12:45:42 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 16, 2020 12:45:42 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 16, 2020 12:45:42 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 16, 2020 12:45:42 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 16, 2020 12:45:42 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 16, 2020 12:45:42 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 16, 2020 12:45:42 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 16, 2020 12:45:42 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 16, 2020 12:45:42 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 16, 2020 12:45:43 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 16, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 16, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-Tuc1pnA7izCw0kjBZB8EXIA9MINyQ6q5EAZMaKdXLV8.jar
    Aug 16, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-Tuc1pnA7izCw0kjBZB8EXIA9MINyQ6q5EAZMaKdXLV8.jar
    Aug 16, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-hEfNyhTyAGW2zg8E--JOMO0KG82cpA4A7ju8-Z393Dw.jar
    Aug 16, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-4ttH-bRaDsxhP9N8TI4OCVMl3OxL6fyBUi_gFmEaIw4.jar
    Aug 16, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-FtqZ95pxW8eVMBxVcLEY8k_qaavYMm5rDfzaYZ8KSnI.jar
    Aug 16, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-aeOgSNhjWhywWeQDnr1nb_7a66C6w5FRuRHOzqVEW7s.jar
    Aug 16, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-drro69JOD6BKA_iHVHAZP1_6VKVCzfKqovygbMYDHLs.jar
    Aug 16, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-mEFGv3iYlFWS2e9K7IDKXaHCPHJBKM5gRLAVcFyeR6w.jar
    Aug 16, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-HRi5dlx4xyL75wOi5htkE2p23hqFaXG4r0scGOJ07ug.jar
    Aug 16, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-Ojd-3EthGB3kSs1EGBW2iHjR8_OMsUVcYgZf946gqS8.jar
    Aug 16, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-PtdyQS_XqXbyrn6QYxc82H1rLmx7n6QYufoP5Db3630.jar
    Aug 16, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-EkNxNVTItotE9sm4FpIcW3xBtZSRGm6E2XIQQzpKhVE.jar
    Aug 16, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-fNF-RF_radSUsunNpbv5KSUALAyrZL9RBD7aaHaFFqc.jar
    Aug 16, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test6888136262948981128.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-MSfWNx5_qeHtNFxErFaWhH9VqIgN6ISf_6ViTdgGv-g.jar
    Aug 16, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-VzodCoDwdb-pF28FMWKkHxU0BudkGzllns6-cOysR2w.jar
    Aug 16, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests--5arzfAQXAzJx5iJrsHeW6b5vSlw9EyHT5Y3f2QmF64.jar
    Aug 16, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-wYcuSFQUPwJeWEK418IPDbhTjJFa6ixQ0B_XDca-l6E.jar
    Aug 16, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-DWV8TLvdm65GdH1JAhxaccciLrgqQQVkFJZvu1nyfu0.jar
    Aug 16, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-iuZSw3aMBA35UZf63AB45YggbVO_G_UP8vgnhb1k2UU.jar
    Aug 16, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-5p52hV10gnvR9SGwhT3T7vyZBcb5HvyNO4T-F3T1M2A.jar
    Aug 16, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-DF2Wkt9DQJzNE2gFVFlN4v82T0iQCYMOwi8FvkWw1Ys.jar
    Aug 16, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-OA5IuSOd3ywYDY536Hd5nWDXJ1pYvwQTL03Y16CNl20.jar
    Aug 16, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-3XJeqiGKSbSVqCs2ZV0wCV_G1FtdJGNUG3aTvgqA0HY.jar
    Aug 16, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-4Qa5HMcJQib9ltU0JvHzYpcfiFdP---ymaLLo0OK0X8.jar
    Aug 16, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-bH0MTH2_Rx2ApGohAJIs83jaO43VB2Nu11nW-z07UMk.jar
    Aug 16, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-KX2OWNdjMixbuLQQt4QXtpCQ9IkojN6A9sW6wFeRK7M.jar
    Aug 16, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-byyUKRNI8yCdCtVd7wanRxCJuepDSYMU0-tpF2bG-TE.jar
    Aug 16, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT--g2q_iTcI8wy79LbiopF5V8sWwg1Cb7s1vmZLPI8cis.jar
    Aug 16, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-SDB5gF5SQvG-URsHG6Dl6RZlrmBZa24gmBAnS4iT80c.jar
    Aug 16, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-jeP7YP9MHKThYA41XnmU-hijNq3s_hOBPELybrvcJJg.jar
    Aug 16, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-aVm5NQ9WmpwivZ86BctrEiQcrRIX5hAIyUx3hfR5Tv8.jar
    Aug 16, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 1 seconds
    Aug 16, 2020 12:45:46 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 16, 2020 12:45:46 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 16, 2020 12:45:46 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 16, 2020 12:45:46 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 16, 2020 12:45:46 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 16, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93834 bytes, hash 2b32b02fa0a6f45a4019061f38675fbe6acdb758c979a50d4fd697bc636d2474> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-KzKwL6Cm9FpAGQYfOGdfvmrNt1jJeaUNT9aXvGNtJHQ.pb
    Aug 16, 2020 12:45:47 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Aug 16, 2020 12:45:48 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-15_17_45_47-11032065181714282985?project=apache-beam-testing
    Aug 16, 2020 12:45:48 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-15_17_45_47-11032065181714282985
    Aug 16, 2020 12:45:48 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-15_17_45_47-11032065181714282985
    Aug 16, 2020 12:45:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-16T00:45:47.194Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 16, 2020 12:45:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-16T00:45:54.871Z: Worker configuration: n1-standard-1 in us-central1-f.
    Aug 16, 2020 12:45:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-16T00:45:55.471Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 16, 2020 12:45:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-16T00:45:55.516Z: Expanding GroupByKey operations into optimizable parts.
    Aug 16, 2020 12:45:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-16T00:45:55.543Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 16, 2020 12:45:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-16T00:45:55.621Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 16, 2020 12:45:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-16T00:45:55.650Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 16, 2020 12:45:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-16T00:45:55.682Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 16, 2020 12:45:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-16T00:45:55.714Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 16, 2020 12:45:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-16T00:45:56.327Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 16, 2020 12:45:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-16T00:45:56.401Z: Starting 5 workers in us-central1-f...
    Aug 16, 2020 12:46:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-16T00:46:09.578Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 16, 2020 12:46:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-16T00:46:20.270Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 16, 2020 12:46:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-16T00:46:39.441Z: Workers have started successfully.
    Aug 16, 2020 12:46:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-16T00:46:39.487Z: Workers have started successfully.
    Aug 16, 2020 12:47:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-16T00:47:07.836Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 16, 2020 12:47:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-16T00:47:08.095Z: Cleaning up.
    Aug 16, 2020 12:47:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-16T00:47:08.157Z: Stopping worker pool...
    Aug 16, 2020 12:48:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-16T00:48:00.535Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 16, 2020 12:48:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-16T00:48:00.570Z: Worker pool stopped.
    Aug 16, 2020 12:48:09 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-15_17_45_47-11032065181714282985 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 49a94be8-b84d-4ccd-b8f3-dafb59901edf and timestamp: 2020-08-16T00:48:09.637000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    10.821

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 16, 2020 12:48:10 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
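
The warning above means this run published nothing to InfluxDB: the publisher needs a measurement and a database, and with neither set only the console summary above is kept. A hedged example of supplying them as pipeline options (option names and values are assumptions based on Beam's test infrastructure, not confirmed by this log):

    --influxDatabase=beam_test_metrics
    --influxMeasurement=sql_bqio_read_java_batch
    --influxHost=http://localhost:8086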

Gradle Test Executor 4 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 2 mins 35.992 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 45s
106 actionable tasks: 77 executed, 29 from cache

Publishing build scan...
https://gradle.com/s/ruv27od4b3g6w

Stopped 3 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #875

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/875/display/redirect>

Changes:


------------------------------------------
[...truncated 293.37 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 15, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 15, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 15, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 15, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 15, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 15, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 15, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
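
The failure above is exactly the gap the message describes: a PCollection of Beam Rows has no inferable coder, so it must be given a schema (or an explicit coder) before the pipeline is finalized. A minimal sketch of that fix in Java, assuming a PCollection<Row> called "rows" and its Schema called "schema" (both names are illustrative, not taken from this test):

    // Hedged sketch, not the IT's actual code: attach a schema to a PCollection<Row>
    // so Beam can derive a RowCoder for it. "rows" and "schema" are assumed to be in scope.
    rows.setRowSchema(schema);

    // Equivalent alternative, matching the first suggestion in the error message:
    rows.setCoder(org.apache.beam.sdk.coders.RowCoder.of(schema));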

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 15, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 15, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 15, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 15, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 15, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 15, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 15, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 15, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
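
For context, the projection and filter push-down reported above only happens on the DIRECT_READ path (the BigQuery Storage Read API); the DEFAULT path earlier in the log keeps all 14 input fields at the source and applies the filter in BeamCalcRel instead. A rough Java sketch of declaring and querying such a table through Beam SQL follows; the table location, column types, and the BeamSqlEnv wiring are assumptions for illustration, not code lifted from this test.

    // Hedged sketch: register a BigQuery table that reads via the Storage Read API
    // (method DIRECT_READ), then run the same shape of query so the planner can push
    // the used fields and the supported filter into the read. Placeholders throughout;
    // assumes a Pipeline named "pipeline" already exists.
    String ddl =
        "CREATE EXTERNAL TABLE HACKER_NEWS (`by` VARCHAR, `type` VARCHAR, `title` VARCHAR, `score` INTEGER) "
            + "TYPE bigquery "
            + "LOCATION 'some-project:some_dataset.hacker_news' "
            + "TBLPROPERTIES '{\"method\": \"DIRECT_READ\"}'";
    String query =
        "SELECT `by` AS author, `type`, `title`, `score` FROM HACKER_NEWS "
            + "WHERE (`type` = 'story' OR `type` = 'job') AND `score` > 2";

    // Assumed wiring; the integration test may construct its SQL environment differently.
    BeamSqlEnv sqlEnv = BeamSqlEnv.inMemory(new BigQueryTableProvider());
    sqlEnv.executeDdl(ddl);
    PCollection<Row> result = BeamSqlRelUtils.toPCollection(pipeline, sqlEnv.parseQuery(query));
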
    Aug 15, 2020 6:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 15, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 15, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-K7dFYMk52QlzgVjxU_i9p6cPOD4eMMwcdv20vAXVsMg.jar
    Aug 15, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-n-zShlCnwu7oxlYJon-yDzpgEogrhQweAwloBtRp7sc.jar
    Aug 15, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-5acBOnsdgd_tQ5J9ywr6OzvaZ5cNDTnYxejwPIcOEWw.jar
    Aug 15, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-5jYXnE19LUqPNeSapwNtvdHkvzo3kmMLEeyPIStr10M.jar
    Aug 15, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-39EjuiRZK0ewFWBkg3tD84jSiJXEFQwLUQTiYcnBGtE.jar
    Aug 15, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-1Nj6PPEPD2M9f4nxyiEmwQJXJ67NYAriWteTnCSNORk.jar
    Aug 15, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-4JeWwRdlVjblpf-Z44a4AxE4Q8WUwuV5nkwitSHuiE8.jar
    Aug 15, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-omcltV3O_sjlYHqqaX8CYW0flof0v1TQ-Y1wGdd2ipQ.jar
    Aug 15, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-Weg3SjFShj8tld6QM17htenLcbGzHO1QNJEcT3lmpzM.jar
    Aug 15, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-Ax-weIXIMwoJWqJH_-uR0XAfuXOuHac1EXuG_Q3vGfw.jar
    Aug 15, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-bAVzLySKbvAhyFXsO7vDcgyZTZe0J0vNsic7Fp-heCU.jar
    Aug 15, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-5yla_nnUMr6Gsec9VDWd3vvha9zifKXC5eo29f2GY9s.jar
    Aug 15, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-WLRwDQP6ZvUzuSy_Y1ILO-B9bbgTXfN9cp02eQORTXA.jar
    Aug 15, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-K7dFYMk52QlzgVjxU_i9p6cPOD4eMMwcdv20vAXVsMg.jar
    Aug 15, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-Appq8laiBq5b2jUwsNt8x2HcnU51kmBnDNttJLjo9ug.jar
    Aug 15, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test1777129475251245449.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-LPHtFQoHo6cxEOxthzadGCb1dqvPDCpamdoNakrE6fo.jar
    Aug 15, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-IC00NH0_vsE4Y-OEFBn7mreTalZfenyDL13iFYGTlmI.jar
    Aug 15, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-wVjNF4y8MJGpALe7PXDWe2bXR-THKRQkamJ7YADHSXc.jar
    Aug 15, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-w4XRBEWzcAinKgKGCXkU5EgGF_yBit73quJKUuG_-L0.jar
    Aug 15, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-QD0pvGljAKrlt7XgTmwJUPeU7Zdn_EZ1wyF9G1uFv1g.jar
    Aug 15, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT--Z_miYWQ5sUdLvOhQq7PaBNgi07jFbY1Hdpj52BXvNc.jar
    Aug 15, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-mMs4cYE8owqvr5b6-bjGK2fwkPfYibbNdreXnlBW4HM.jar
    Aug 15, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-yJH5zH5l78atFI7J9CWPxxVFQ_AgXoNMQxj_1mLa4_g.jar
    Aug 15, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-u94TokowaqFdgsbmUafkdBviH7cBCyA4um_T_UhXbi4.jar
    Aug 15, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-rRfz9FCM7Ykj4PbVMJyOhEfF5C7cyXOswkrAqcm2DNM.jar
    Aug 15, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-kQPWEKvaW4wXVyu4gIomIfojoV_uyfTXOMLZd2V6TYY.jar
    Aug 15, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-5uaDXgjKB4Xqsg488DeHo5lRjsY5IuC5-KrV5UY7FX8.jar
    Aug 15, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-aYLJaYquUMyMMrln8QNrjMKxgksssJ8eSHl2KqbC81U.jar
    Aug 15, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-soXr2Kzohiid3o12XXxf2Bs9eIsK2JWeiEvY5d74KBk.jar
    Aug 15, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-bxyvywU94ywfkhP_Zi0ttU3iQIaP6CaAtJkmml6H3mM.jar
    Aug 15, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-FouL66L-oG7YFJK3tS1w1NTCE2ocDQNDZWASsq6aAio.jar
    Aug 15, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 1 seconds
    Aug 15, 2020 6:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 15, 2020 6:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 15, 2020 6:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 15, 2020 6:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 15, 2020 6:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 15, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93834 bytes, hash 3d35de856edbe948b9bf8bfabb0c095041ce7bdce489e87747c63d38df87c149> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-PTXehW7b6Ui5v4v6uwwJUEHOe9zkieh3R8Y9ON-HwUk.pb
    Aug 15, 2020 6:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Aug 15, 2020 6:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-15_11_45_22-7387258120304245059?project=apache-beam-testing
    Aug 15, 2020 6:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-15_11_45_22-7387258120304245059
    Aug 15, 2020 6:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-15_11_45_22-7387258120304245059
    Aug 15, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-15T18:45:22.560Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 15, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-15T18:45:30.101Z: Worker configuration: n1-standard-1 in us-central1-f.
    Aug 15, 2020 6:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-15T18:45:42.084Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 15, 2020 6:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-15T18:45:42.113Z: Expanding GroupByKey operations into optimizable parts.
    Aug 15, 2020 6:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-15T18:45:42.151Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 15, 2020 6:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-15T18:45:42.232Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 15, 2020 6:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-15T18:45:42.282Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 15, 2020 6:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-15T18:45:42.308Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 15, 2020 6:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-15T18:45:42.342Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 15, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-15T18:45:42.839Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 15, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-15T18:45:42.918Z: Starting 5 workers in us-central1-f...
    Aug 15, 2020 6:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-15T18:45:58.114Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 15, 2020 6:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-15T18:46:08.324Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 15, 2020 6:46:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-15T18:46:30.493Z: Workers have started successfully.
    Aug 15, 2020 6:46:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-15T18:46:30.523Z: Workers have started successfully.
    Aug 15, 2020 6:47:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-15T18:47:03.413Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 15, 2020 6:47:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-15T18:47:03.549Z: Cleaning up.
    Aug 15, 2020 6:47:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-15T18:47:03.627Z: Stopping worker pool...
    Aug 15, 2020 6:47:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-15T18:47:48.695Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 15, 2020 6:47:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-15T18:47:48.730Z: Worker pool stopped.
    Aug 15, 2020 6:47:56 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-15_11_45_22-7387258120304245059 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): a3e7dbef-a5fc-49cd-9f5e-d595d4d39b66 and timestamp: 2020-08-15T18:47:56.715000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    14.416

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 15, 2020 6:47:57 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.018 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 7,5,main]) completed. Took 2 mins 48.528 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 36s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/u6rm5pyesylcu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #874

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/874/display/redirect?page=changes>

Changes:

[neville.lyh] [BEAM-10612] Add flink 1.11 runner

[Luke Cwik] [BEAM-8025] Update tests to use TemporaryFolder instead of rolling their


------------------------------------------
[...truncated 293.36 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 15, 2020 12:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 15, 2020 12:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 15, 2020 12:45:20 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 15, 2020 12:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 15, 2020 12:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 15, 2020 12:45:20 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 15, 2020 12:45:20 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 15, 2020 12:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 15, 2020 12:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 15, 2020 12:45:20 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 15, 2020 12:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 15, 2020 12:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 15, 2020 12:45:20 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 15, 2020 12:45:20 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 15, 2020 12:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 15, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 15, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 15, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-aYldpNO_04_uCdBHKe3Hd4UKmr1WvrNODv6EFuTS1V4.jar
    Aug 15, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-n8BOloOQ-qv9DjiEviicmn1wr54DZoMwG30Z35kq3S0.jar
    Aug 15, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-aA1XJST1DkDZHo7YB_01NOtWgDnpj1LEHZeQ7d4izHo.jar
    Aug 15, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-_zILCSWw_-UwnkIA8iv-GP8b1rrh277m3wtDms0AlaY.jar
    Aug 15, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3817968050396013266.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-kDAz-BNgzPBPAT0vQq32pe7x659uTvEEgH8a7afa9dc.jar
    Aug 15, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-QrTXWxXQ6vtCVpUoaC8V-X8iroRvU_OUUHBWy-b_wjY.jar
    Aug 15, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-jmKPeRq5JAy5qQDKlUSdVZfDWSRgBvC8AzsbKWyT01Y.jar
    Aug 15, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-br38aMMXvnKz6JkkjGZt5OE0wthjKQnmWBd8-Eezq8U.jar
    Aug 15, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-BzUYfw0JYIZw9qzNCT4RKDCA9sgtnn8Tbop-AorrOoA.jar
    Aug 15, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-BO-onqSfxUqwdj-hOxQ1SgCJY2hyOatNKQhGtZjbd1I.jar
    Aug 15, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-aYldpNO_04_uCdBHKe3Hd4UKmr1WvrNODv6EFuTS1V4.jar
    Aug 15, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-1HfDBDeacoFt0W_uNDHf4Hg-ut47yO1w8L-Jg0qSdb8.jar
    Aug 15, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-eszdGiofLzFPkp1FM3AVHfW4jCGuDYOt0ehL2iTck8o.jar
    Aug 15, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-oGhNub74bVd-uESK_HNDHOP__kknTTVQkULcuGMT1ds.jar
    Aug 15, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-8GVnkw3oWwalZv7QRbcSxKgHTLLhSpCqF5kP8resDBQ.jar
    Aug 15, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-fFRMFOH6JDmPU3WuEHMzm36S-t6MyHGS2CiCbUmdHdQ.jar
    Aug 15, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-p8Ux4r2EiVR3MezXgvQdd6ElP-c_JhUpsl7R7bxYnQo.jar
    Aug 15, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-nPdrC2lQMU1QYVdOSBWcptmilhsJman3ShjcT3hjFXk.jar
    Aug 15, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-FIbUgE9PkQTR37oMwjoHHEELaNlgPA-6Z0flO01dqP4.jar
    Aug 15, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-LyZ4V05KdyOzlkkkajVWeFByYoG6Nqd0x6hXczApY34.jar
    Aug 15, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-TAJ4urDSrWfsBjNv2R82qjohCFrnsmJSYP329AUpNRo.jar
    Aug 15, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-qX5JFvoUJXtCoiPmYPWppn0s9t710EozKwjveRK0rpU.jar
    Aug 15, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-0ChJ3yMSqETZQAPbuCM_CuH8z18CbaCLw0SaRbkeQqw.jar
    Aug 15, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-ca4ZHmVt9ieL8CAZwl-gCDQ7L7KDaxYQ3UgjrPsS4_M.jar
    Aug 15, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-V84Oqc4CSgAfTekkiDg9nA-0UHRYUbKG9WiKPM_5SMY.jar
    Aug 15, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-R35OEarc-l-MYpdTLd5qB4RxIVcwfv8pE7leU_Y1T8M.jar
    Aug 15, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-ZvuQO9MTdUMTEngm2sFyB-PsEZbF42sqDQszYuZgipE.jar
    Aug 15, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-kpl3RA0i0Mtl4NokClgkdY1zbhAagMBgl6UpiN5f_ag.jar
    Aug 15, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-cBWAeiaNo1SRxdTr0Nv_2WNosVWpIiKFzhvjw9Ie4lE.jar
    Aug 15, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-LA9dSTmRGUtZUEqE3UU46ARshVUPkM9z3QljUP9wkF0.jar
    Aug 15, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-IkDo6hGKIofBskmh1FNp1AjSGkzuHyXl5Rl2K0rSte0.jar
    Aug 15, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 1 seconds
    Aug 15, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 15, 2020 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 15, 2020 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 15, 2020 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 15, 2020 12:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 15, 2020 12:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93833 bytes, hash 31641909c0593ee33ea536d74b2ebc741fde912df69cd4bb4592721053561051> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-MWQZCcBZPuM-pTbXSy68dB_ekS32nNS7RZJyEFNWEFE.pb
    Aug 15, 2020 12:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Aug 15, 2020 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-15_05_45_25-4031327150580372793?project=apache-beam-testing
    Aug 15, 2020 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-15_05_45_25-4031327150580372793
    Aug 15, 2020 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-15_05_45_25-4031327150580372793
    Aug 15, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-15T12:45:25.478Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 15, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-15T12:45:34.235Z: Worker configuration: n1-standard-1 in us-central1-f.
    Aug 15, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-15T12:45:34.877Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 15, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-15T12:45:34.925Z: Expanding GroupByKey operations into optimizable parts.
    Aug 15, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-15T12:45:34.969Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 15, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-15T12:45:35.053Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 15, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-15T12:45:35.092Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 15, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-15T12:45:35.141Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 15, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-15T12:45:35.177Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 15, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-15T12:45:35.594Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 15, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-15T12:45:35.676Z: Starting 5 workers in us-central1-f...
    Aug 15, 2020 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-15T12:45:57.501Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 15, 2020 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-15T12:46:02.797Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 15, 2020 12:46:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-15T12:46:24.107Z: Workers have started successfully.
    Aug 15, 2020 12:46:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-15T12:46:24.145Z: Workers have started successfully.
    Aug 15, 2020 12:46:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-15T12:46:57.638Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 15, 2020 12:46:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-15T12:46:57.824Z: Cleaning up.
    Aug 15, 2020 12:46:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-15T12:46:57.907Z: Stopping worker pool...
    Aug 15, 2020 12:47:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-15T12:47:49.623Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 15, 2020 12:47:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-15T12:47:49.675Z: Worker pool stopped.
    Aug 15, 2020 12:47:58 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-15_05_45_25-4031327150580372793 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 29485716-6f0c-4db2-a41c-fcef6e95b4bc and timestamp: 2020-08-15T12:47:58.727000000Z:
                     Metric:                    Value:
                   read_time                    13.323
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 15, 2020 12:47:59 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.031 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 7,5,main]) completed. Took 2 mins 47.734 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 38s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/5hineq4rebdfs

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #873

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/873/display/redirect?page=changes>

Changes:

[Boyuan Zhang] Scale progress with respect to windows observation.


------------------------------------------
[...truncated 293.82 KB...]
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 15, 2020 6:45:29 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 15, 2020 6:45:29 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 15, 2020 6:45:29 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 15, 2020 6:45:29 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 15, 2020 6:45:29 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 15, 2020 6:45:29 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 15, 2020 6:45:29 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
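
The IllegalStateException above is the coder-inference failure that the message itself describes: a PCollection of Beam Rows needs a schema before a RowCoder can be chosen. As a minimal, illustrative sketch only (not the BigQueryIOPushDownIT code; the class name, fields, and sample values are hypothetical), attaching a schema with PCollection.setRowSchema to the Row-producing ParDo output is the remedy the exception message points to:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Hypothetical schema mirroring the columns projected by the query above.
        final Schema schema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt32Field("score")
                .build();

        PCollection<Row> rows =
            p.apply(Create.of(3, 5, 7))
                .apply(
                    ParDo.of(
                        new DoFn<Integer, Row>() {
                          @ProcessElement
                          public void process(@Element Integer score, OutputReceiver<Row> out) {
                            // Emit Rows; without a schema on the output PCollection,
                            // coder inference fails exactly as in the log above.
                            out.output(
                                Row.withSchema(schema)
                                    .addValues("some author", "story", "some title", score)
                                    .build());
                          }
                        }))
                // Attach the schema so a RowCoder can be inferred downstream.
                .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }

In the failing tests this would correspond to setting a row schema on the output of the RowMonitor ParDo before it is consumed, which appears to be what the exception message asks for.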

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 15, 2020 6:45:29 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 15, 2020 6:45:29 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 15, 2020 6:45:29 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 15, 2020 6:45:29 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 15, 2020 6:45:29 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 15, 2020 6:45:29 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 15, 2020 6:45:29 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 15, 2020 6:45:29 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
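
The plan and the "Pushing down the following filter" line show the SQL layer handing both the projected fields and the supported predicate to the BigQuery Storage Read API. As a rough, illustrative sketch (not the IT's code; the table reference is hypothetical), the same read expressed directly with BigQueryIO would look roughly like this, with the field list and row restriction mirroring the pushed-down query:

    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class DirectReadSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        p.apply(
            BigQueryIO.readTableRows()
                .from("bigquery-public-data:hacker_news.full") // hypothetical table reference
                .withMethod(Method.DIRECT_READ)
                // Only the used fields are read from storage ...
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                // ... and the supported filter is evaluated server-side.
                .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }
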
    Aug 15, 2020 6:45:30 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 15, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 15, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-2zmoWZc-kSRwkR5GQCaMjSaYYG7cLgt3WsTV2SYkT7c.jar
    Aug 15, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-3J41P9SePTmg_J92tNl88lwlYdqM_l0HaPlsZicr0N8.jar
    Aug 15, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-lqefK4mULksr_vKkAiVlv9Dm9JTaZklMB6Mt_RaICC0.jar
    Aug 15, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-3npuirV-FeSUtGoOcOCYxnvGpyO_FI7Gi5r9aYv-8RY.jar
    Aug 15, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT--1PHZPtqt5oOmiihh414tvntrIvDKIkLYD6LOnTu37Q.jar
    Aug 15, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-rLtEDyWJXjVZUsy_KU2t2BmsSRL8CncWHxPZ2HTjODk.jar
    Aug 15, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4993012968639345693.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-VjvkD3kJ1k-pMw68gLXljjt1wdMv6H6HVhVvf99oPrA.jar
    Aug 15, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-fI95qwX091M9uFOjnUXuNlaDsQpY6J-80P86eoC4BXI.jar
    Aug 15, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-2zmoWZc-kSRwkR5GQCaMjSaYYG7cLgt3WsTV2SYkT7c.jar
    Aug 15, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-WY0CXA78ZdEodfOZTj1gwI8hlg9-dxmfpZp1iuYIxU4.jar
    Aug 15, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-cDqmVOUa0Plmjoh8pBQVL_oWgVcntLP0FytGv_sEw2w.jar
    Aug 15, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT--G9kNFY09ktoFkRa717Q9T1pmY4KmObY0VSQFbMOR-c.jar
    Aug 15, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-rzfSbZ-kwbVLpVkFSDk9AFz9CqmbncsZnzKXlBf9jJU.jar
    Aug 15, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-qv-rMpzCMDOR6F3zODVZxZAxGfsw389zygUqeUC0dak.jar
    Aug 15, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-ZjkZ_p4gl1MOzUTl935toIjHvTQrbwjdKRpUjZz9S8w.jar
    Aug 15, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-r7H1eShhfFlWfLorEyL5IiXkU7Re9wN7z7X7BXPF1oY.jar
    Aug 15, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-xX_0g3fCk_MXUEIYHJIrl1wpuN9a-AmYErCn_6mk-NI.jar
    Aug 15, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-Ew01rStJ-futnxdWQVzZhc-8moKKYV82EuCHdZQ3IAI.jar
    Aug 15, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-powSSLqbgJv1CJ2qzJlBxljhh6XouxY333YRAt621jc.jar
    Aug 15, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-KrT1fpKwqlDlQ1nI6AN-k94wk_Ws4rYJx_8u-i-8IQQ.jar
    Aug 15, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-xZmKGubMHA8sEe5RHBWnyRYMwXQB_tcdBXXdHW6MEBU.jar
    Aug 15, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-rAHZPptkmFu90Q6prgMW-Rje8ZX8RG7R1FTfUNLk7OQ.jar
    Aug 15, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-yZ_LoVaOJNrvmQiucDSCQSM0u09682s_44Kr3ex9hQM.jar
    Aug 15, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT--0peKhtjmTioIx2YSfF7RA7oxjTePZg5sTrT1Y4mDHo.jar
    Aug 15, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-LcnjFpQxYjD1MZc0KrnxwjKdbcnU98y9WITbxvHHxew.jar
    Aug 15, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-WAC4iDO46_LkFZfeTVq74YIhGb3HfFPS8qlKiPBQ3iQ.jar
    Aug 15, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-yi4-AGey-NEU_qhLCEBKXJdmoV0UxQ6Gwc-4NhFtX4s.jar
    Aug 15, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-c-v5lmplSnkP2MyY0WKEDqYx2dafA3Y3W4oRPust23o.jar
    Aug 15, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-qO-YzzI-nRrh3uIDv9oR0ty7lQUNZMmwjk2nxQEkUU4.jar
    Aug 15, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-jeovMSFJ4EbjJeFW4s-kHMRGLZjfFfxdbRbdp7sWrrY.jar
    Aug 15, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-1MVl_SKL6blluW8_FWndBMmKTmwKwXusD0boOVQLRbw.jar
    Aug 15, 2020 6:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 2 seconds
    Aug 15, 2020 6:45:34 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 15, 2020 6:45:35 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 15, 2020 6:45:35 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 15, 2020 6:45:35 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 15, 2020 6:45:35 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 15, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93837 bytes, hash 59155d3c4398697b3fab96162bad02b23aa311e6683a3d42b3e91f36e638725b> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-WRVdPEOYaXs_q5YWK60CsjqjEeZoOj1Cs-kfNuY4cls.pb
    Aug 15, 2020 6:45:35 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Aug 15, 2020 6:45:36 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-14_23_45_35-9541764154349251578?project=apache-beam-testing
    Aug 15, 2020 6:45:36 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-14_23_45_35-9541764154349251578
    Aug 15, 2020 6:45:36 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-14_23_45_35-9541764154349251578
    Aug 15, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-15T06:45:35.441Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 15, 2020 6:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-15T06:45:44.006Z: Worker configuration: n1-standard-1 in us-central1-f.
    Aug 15, 2020 6:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-15T06:45:44.572Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 15, 2020 6:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-15T06:45:44.614Z: Expanding GroupByKey operations into optimizable parts.
    Aug 15, 2020 6:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-15T06:45:44.650Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 15, 2020 6:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-15T06:45:44.718Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 15, 2020 6:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-15T06:45:44.748Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 15, 2020 6:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-15T06:45:44.784Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 15, 2020 6:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-15T06:45:44.820Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 15, 2020 6:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-15T06:45:45.209Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 15, 2020 6:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-15T06:45:45.282Z: Starting 5 workers in us-central1-f...
    Aug 15, 2020 6:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-15T06:45:51.682Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 15, 2020 6:46:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-15T06:46:07.775Z: Autoscaling: Raised the number of workers to 2 based on the rate of progress in the currently running stage(s).
    Aug 15, 2020 6:46:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-15T06:46:07.811Z: Resized worker pool to 2, though goal was 5.  This could be a quota issue.
    Aug 15, 2020 6:46:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-15T06:46:13.115Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 15, 2020 6:46:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-15T06:46:28.550Z: Workers have started successfully.
    Aug 15, 2020 6:46:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-15T06:46:28.574Z: Workers have started successfully.
    Aug 15, 2020 6:47:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-15T06:47:01.785Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 15, 2020 6:47:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-15T06:47:01.934Z: Cleaning up.
    Aug 15, 2020 6:47:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-15T06:47:02.014Z: Stopping worker pool...
    Aug 15, 2020 6:47:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-15T06:47:47.682Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 15, 2020 6:47:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-15T06:47:47.724Z: Worker pool stopped.
    Aug 15, 2020 6:47:55 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-14_23_45_35-9541764154349251578 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 09799dbb-5c31-447e-bcd7-850f31bbac3f and timestamp: 2020-08-15T06:47:55.774000000Z:
                     Metric:                    Value:
                   read_time                    12.989
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 15, 2020 6:47:56 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.018 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 2 mins 35.929 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 34s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/sarci2sotdigw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #872

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/872/display/redirect?page=changes>

Changes:

[Luke Cwik] [BEAM-10688] Euphoria assumes that all type descriptors are resolvable

[Luke Cwik] [BEAM-10670] Use fraction of remainder if consumed fraction is unknown

[Luke Cwik] [BEAM-10670] Improve splitting logic to prefer splits upto the the

[Luke Cwik] [BEAM-10670] Fix passing forward the self-checkpoint from the

[noreply] [BEAM-9547] Implement some methods for deferred Series. (#12534)

[Robert Burke] Fix broken build.


------------------------------------------
[...truncated 294.45 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 15, 2020 12:45:20 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 15, 2020 12:45:20 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 15, 2020 12:45:20 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 15, 2020 12:45:20 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 15, 2020 12:45:20 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 15, 2020 12:45:20 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 15, 2020 12:45:20 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 15, 2020 12:45:20 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 15, 2020 12:45:20 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 15, 2020 12:45:20 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 15, 2020 12:45:20 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 15, 2020 12:45:20 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 15, 2020 12:45:20 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 15, 2020 12:45:21 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 15, 2020 12:45:21 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 15, 2020 12:45:22 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 15, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 15, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-ek62UPs_sFFVLMAmMLoHc95VS6vWyKvM5UYoDy6mjGM.jar
    Aug 15, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-JvHtgS8mnCnmCRbz-2naXuyQkQM-14AQZxpcAF6ccDg.jar
    Aug 15, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-QuhCEJeijPi6g6HNdy9MWT9yrTmxU4Fmwdo6zFmiZps.jar
    Aug 15, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-u-Ws6ADD5Irnazb1qAZpT2wtqk6gAA_xMfegzoS7XkM.jar
    Aug 15, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4349251151128645555.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-bT9AK3bTay-c5Ora4oTGkXdEXjrLx1Z9XKxz_tHhQMM.jar
    Aug 15, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-QdTd1AnckvccBmfHYez1ZDDN9T5Z5mDvVYKK9NL6rl0.jar
    Aug 15, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-rMVyx_kQ6xj2RkRByghhMi2rzlSGYmecW_PO960P6yM.jar
    Aug 15, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-KnmpC45_rHTgsftV1hzLI7NzVCv4LxCR0jD_pO2QzH0.jar
    Aug 15, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-aF_4CG1u4ugPgtSg7XnlURMOQee79sqXRHmQTe96EC8.jar
    Aug 15, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-U_vlWXaVskNfA9RS3XtxYqUYMPAhLUAsJy74SrGvFGE.jar
    Aug 15, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-Cv-Tz8n8WkrJJfx4cs5RAPxNqD-J7Fbc5p03dnZSgcI.jar
    Aug 15, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-w5X9ujMa2MST-VEZwvfqoi0id7fEYxG_ZLVJ5YlRslY.jar
    Aug 15, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-kNrNuHhSKKhr1DV4n2z8z-Ffpf2HoO694g8vhIJcVdc.jar
    Aug 15, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-Nf75-64patHIe57F-c8FJ6RLYn4rPlRYZIaxmXfodv4.jar
    Aug 15, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-0MSWOzu0_aAsBjJUHEAftY_4OzX6_QmGMhgsvyKufj4.jar
    Aug 15, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-bQwNb-yiw1oJBPnpE7WyA7aRNUEOy9XNjMYZ-pgWppo.jar
    Aug 15, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-Ekx20qvkzomNzFmRr0ehCOjhWaaSO5RHLSCC9MU6Hcg.jar
    Aug 15, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-grKaG8suqFUJgpll5wVXr4UEQiD12xF69MZQucOSI44.jar
    Aug 15, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-ek62UPs_sFFVLMAmMLoHc95VS6vWyKvM5UYoDy6mjGM.jar
    Aug 15, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-q2gJ8ik9klQ-Zu_75wCdoe2QH7c_hcQwnRyx0WvF_m4.jar
    Aug 15, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-KdSSIqQTFGchAxR2TuZADernbRSmV3pWD6_Sm2xFhtk.jar
    Aug 15, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-tOH4tne74Y7uN03UbxXgZKWFOr2oJnmjCfvmrnDpScE.jar
    Aug 15, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-f7R5xJ8wCxscPXD5vm8AnyIi47DeQx6JYLv-PCbOGKg.jar
    Aug 15, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-AUaB6nZJvNmZvs-iYqSDa-vk4t3X_OvHj9MzN2BktH0.jar
    Aug 15, 2020 12:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-Y4kz3GhbgwaldgncQG88Kiilc-OZ1Aq8v1EJmwQ9Qcg.jar
    Aug 15, 2020 12:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-SAdeqO3YbwI4XGNgggedNJL-Srl-wbARoZZrnH-v0GA.jar
    Aug 15, 2020 12:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-TJf8vLAN9Rr4JWfb9r2CJ_YUOnXV9OCZWiVLitJWf9w.jar
    Aug 15, 2020 12:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-iVio9GmrL9GshrncpDe4M0bpsht1PQGjKwI6EghEV3s.jar
    Aug 15, 2020 12:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-M5FCOWeyjP1Xm5NRs4FLZVbnAVDV293Z87e1kNZnnt8.jar
    Aug 15, 2020 12:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-wArArB0SFs-HF02FUwNNKoSbXFlou4jPr5qt4qnlU_8.jar
    Aug 15, 2020 12:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-4gyjQUUlsFPrDchuzJ3S2_9oWEsHlf55L8aUE6Wxk3k.jar
    Aug 15, 2020 12:45:26 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 1 seconds
    Aug 15, 2020 12:45:26 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 15, 2020 12:45:26 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 15, 2020 12:45:26 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 15, 2020 12:45:26 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 15, 2020 12:45:26 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 15, 2020 12:45:26 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93834 bytes, hash ec43215f8d10e6ed639f693342bdd5e3f4cd214473ee54b493a575a12520ce6e> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-7EMhX40Q5u1jn2kzQr3V4_TNIURz7lS0k6V1oSUgzm4.pb
    Aug 15, 2020 12:45:26 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Aug 15, 2020 12:45:34 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-14_17_45_26-15730512736366674423?project=apache-beam-testing
    Aug 15, 2020 12:45:34 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-14_17_45_26-15730512736366674423
    Aug 15, 2020 12:45:34 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-14_17_45_26-15730512736366674423
    Aug 15, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-15T00:45:26.685Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 15, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-15T00:45:40.518Z: Worker configuration: n1-standard-1 in us-central1-a.
    Aug 15, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-15T00:45:41.190Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 15, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-15T00:45:41.270Z: Expanding GroupByKey operations into optimizable parts.
    Aug 15, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-15T00:45:41.301Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 15, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-15T00:45:41.416Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 15, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-15T00:45:41.452Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 15, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-15T00:45:41.477Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 15, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-15T00:45:41.512Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 15, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-15T00:45:42.066Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 15, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-15T00:45:42.128Z: Starting 5 workers in us-central1-a...
    Aug 15, 2020 12:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-15T00:45:52.602Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 15, 2020 12:46:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-15T00:46:08.823Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 15, 2020 12:46:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-15T00:46:29.067Z: Workers have started successfully.
    Aug 15, 2020 12:46:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-15T00:46:29.096Z: Workers have started successfully.
    Aug 15, 2020 12:47:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-15T00:47:05.309Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 15, 2020 12:47:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-15T00:47:05.471Z: Cleaning up.
    Aug 15, 2020 12:47:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-15T00:47:05.560Z: Stopping worker pool...
    Aug 15, 2020 12:48:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-15T00:48:06.473Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 15, 2020 12:48:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-15T00:48:06.520Z: Worker pool stopped.
    Aug 15, 2020 12:48:15 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-14_17_45_26-15730512736366674423 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): c5dd6e75-f16f-4008-b290-8ea754c0f968 and timestamp: 2020-08-15T00:48:15.224000000Z:
                     Metric:                    Value:
                   read_time                    16.728
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 15, 2020 12:48:15 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.022 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 7,5,main]) completed. Took 3 mins 4.477 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 54s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/tozdfgshw6vuk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #871

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/871/display/redirect?page=changes>

Changes:

[Boyuan Zhang] Enable dataflow streaming engine when running runner_v2 and streaming.

[Boyuan Zhang] Fix formatter.

[Boyuan Zhang] Use unbounded wrapper for Kafka Read.

[je.ik] [BEAM-10691] Use FlinkStateInternals#addWatermarkHoldUsage for timer

[noreply] [BEAM-9615] Map user types to Schema reps. (#12554)

[noreply] fixed a typo in S3TestUtils (#12582)


------------------------------------------
[...truncated 295.76 KB...]
    INFO: BigQuery method is set to: DEFAULT
    Aug 14, 2020 6:46:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 14, 2020 6:46:20 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 14, 2020 6:46:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 14, 2020 6:46:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 14, 2020 6:46:20 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 14, 2020 6:46:20 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
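
As the exception above spells out, the failure is a coder-inference problem on a PCollection of Beam Rows, not a query problem: the output of the RowMonitor ParDo needs either an explicit coder or a row schema. The following is a minimal, self-contained Java sketch of that remedy under stated assumptions; the schema field types and the pass-through DoFn standing in for RowMonitor are illustrative and are not taken from BigQueryIOPushDownIT.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Field names follow the query's projection; the types are assumptions for illustration.
        Schema schema = Schema.builder()
            .addNullableField("author", Schema.FieldType.STRING)
            .addNullableField("type", Schema.FieldType.STRING)
            .addNullableField("title", Schema.FieldType.STRING)
            .addNullableField("score", Schema.FieldType.INT64)
            .build();

        Row sample = Row.withSchema(schema).addValues("alice", "story", "hello", 3L).build();

        PCollection<Row> rows =
            p.apply(Create.of(sample).withRowSchema(schema))
                // Stand-in for the test's RowMonitor ParDo: any DoFn that re-emits Rows.
                .apply("RowMonitor", ParDo.of(
                    new DoFn<Row, Row>() {
                      @ProcessElement
                      public void processElement(@Element Row row, OutputReceiver<Row> out) {
                        out.output(row);
                      }
                    }))
                // Without this call (or an equivalent setCoder(RowCoder.of(schema))),
                // coder inference fails exactly as in the IllegalStateException above.
                .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }
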

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 14, 2020 6:46:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 14, 2020 6:46:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 14, 2020 6:46:20 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 14, 2020 6:46:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 14, 2020 6:46:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 14, 2020 6:46:20 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 14, 2020 6:46:21 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 14, 2020 6:46:21 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
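
For context on what this push-down buys: the usedFields list and the "supported" filter reported by BeamPushDownIOSourceRel correspond to the column projection and row restriction that the BigQuery Storage read API accepts. Below is a hedged sketch of the roughly equivalent hand-written read; the table reference is an assumption for illustration, and the IT itself resolves its table through the Beam SQL catalog rather than calling BigQueryIO directly.

    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class DirectReadSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        p.apply("Read Input BQ Rows with push-down",
            BigQueryIO.readTableRows()
                // Hypothetical table reference used only for this sketch.
                .from("bigquery-public-data:hacker_news.full")
                .withMethod(TypedRead.Method.DIRECT_READ)
                // usedFields from the BeamPushDownIOSourceRel become the selected columns...
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                // ...and the "supported" predicate becomes a Storage API row restriction.
                .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }
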
    Aug 14, 2020 6:46:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 14, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 14, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-Ynl1S8c0dDhg1bWaN_wotzI2c-x0QjSF3-QBO45jIoY.jar
    Aug 14, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-wmf-ntwxRKzSnNXKZEfYyTatnnoiO6wOAwf08bx8e2A.jar
    Aug 14, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-E3EaTHPKj9G1S2_X_kW9_yLrcO1W1f7LxMA4pVKGKkk.jar
    Aug 14, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-WLStE7ZVIxJCwhB2ik0HUiDClsx-24VMg6dTAlj8_z8.jar
    Aug 14, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-Afqc2W7lDoEdLRtdCmuydM8ljzZB7-89nIEiMAqb4aM.jar
    Aug 14, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-caY6dl0DjEKQR37LZxxdI0CvDlwZABpGYXGyF6t4Jz8.jar
    Aug 14, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-1XTlTkziLW5xHyi80XJndQTYZne97VGiAPZgNX9IjLE.jar
    Aug 14, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-ZCe1Hl2WNt-CPhON3ql2EuGceDZNuvoh04hnXNFTB_Y.jar
    Aug 14, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-xELeFnj-uOOemZge3KQV5mgoFSs-rH45BC9fq5vJnCY.jar
    Aug 14, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-rIA1YDt0gBO8xFU-dae-MdcXyl7aC6k0IT5G1qXR97g.jar
    Aug 14, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-EMyWdtD3-YLjBn8VL4oN80PmZlsrfK8gX66CxnwkS_k.jar
    Aug 14, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-Wb700oYrWRv0nS-YTxT0dgzXg6Xf3VnpF5mfRCBXVeQ.jar
    Aug 14, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7694816558573209691.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-d1pUhFCzZKhdqilGl-nZ5PV0IsVM3i9MOTDIqNO4fnk.jar
    Aug 14, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-utegzEXExz_TnOs261QloiuDYwXNYRdGUddBIChqpOs.jar
    Aug 14, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-ZCe1Hl2WNt-CPhON3ql2EuGceDZNuvoh04hnXNFTB_Y.jar
    Aug 14, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-7rcOFHOwVJxa5cql6omB7-abSOx3kwMTAz5asaEiRfQ.jar
    Aug 14, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-8STOMlS0OC9qofvz61RRo7-uYhhckUEvjEHeo-UBLI4.jar
    Aug 14, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-RUZlvVBbn15p2gVtm5Penmd9DXU_gt0yjKFTc5-bZtY.jar
    Aug 14, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-GF5_xkJjGaTHO69YrYLKrynhVCR2J_P2mARph9aXzTg.jar
    Aug 14, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-BS6nPifVrni5YpjmjDxNwNX3rxOCB6igUdvOo5YVqhk.jar
    Aug 14, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-kdUakawyVNuPi_0YtydYlYoM11E3JEaHXWf0-qe1o_g.jar
    Aug 14, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-4ukM6AoW2XDZSwSQMI0klN-P3TC8R3E7Sx-eqopRmDU.jar
    Aug 14, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-WQpUEHnD74t3QxDzmC3h7SW8vUwC3yQmC5_c22OSEkc.jar
    Aug 14, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-JSjR2gr1TNV-4FpgawiiDnDl2raLXmB0WXqkt_LysS8.jar
    Aug 14, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-tU0j9WlGKsqKhgcGB8pBOP09hhBbOAiYS__Y-dDqd6I.jar
    Aug 14, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-WnZq87Kb96O4Z9StHSeTB5tnJ-cd20o2IfpbX8iVUwg.jar
    Aug 14, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-zEV7zaZGtBqfs-zXNNJoCcAB0yLPbQxf0SzD340bMDY.jar
    Aug 14, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-2D9pFp56ZO_vITn8g2eNKEq5MzJm4ckhAPieKPvGy-Q.jar
    Aug 14, 2020 6:46:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-hVQWO2-NbyNmShl-FUkM99DWfjvADSGNLF5CMbr_iP4.jar
    Aug 14, 2020 6:46:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-OANoJC6MYQKW9LgB7UTCmbljBs9H5-XF0RNQz1sK1Qc.jar
    Aug 14, 2020 6:46:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-z3rKB3jDYpFTWyRf4NsRTpP2CmaYBTiWa-MF6c8P2u4.jar
    Aug 14, 2020 6:46:24 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 0 seconds
    Aug 14, 2020 6:46:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 14, 2020 6:46:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 14, 2020 6:46:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 14, 2020 6:46:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 14, 2020 6:46:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 14, 2020 6:46:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93835 bytes, hash 598873c241da801d6d57a8dee2ef70c722f8f7867ca879d2684b728664d746ef> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-WYhzwkHagB1tV6je4u9wxyL494Z8qHnSaEtyhmTXRu8.pb
    Aug 14, 2020 6:46:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Aug 14, 2020 6:46:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-14_11_46_25-4463250600281857589?project=apache-beam-testing
    Aug 14, 2020 6:46:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-14_11_46_25-4463250600281857589
    Aug 14, 2020 6:46:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-14_11_46_25-4463250600281857589
    Aug 14, 2020 6:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-14T18:46:25.380Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 14, 2020 6:46:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-14T18:46:32.275Z: Worker configuration: n1-standard-1 in us-central1-f.
    Aug 14, 2020 6:46:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-14T18:46:33.000Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 14, 2020 6:46:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-14T18:46:33.041Z: Expanding GroupByKey operations into optimizable parts.
    Aug 14, 2020 6:46:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-14T18:46:33.061Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 14, 2020 6:46:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-14T18:46:33.147Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 14, 2020 6:46:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-14T18:46:33.174Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 14, 2020 6:46:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-14T18:46:33.210Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 14, 2020 6:46:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-14T18:46:33.235Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 14, 2020 6:46:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-14T18:46:33.695Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 14, 2020 6:46:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-14T18:46:33.778Z: Starting 5 workers in us-central1-f...
    Aug 14, 2020 6:47:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-14T18:47:01.781Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 14, 2020 6:47:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-14T18:47:27.303Z: Autoscaling: Raised the number of workers to 2 based on the rate of progress in the currently running stage(s).
    Aug 14, 2020 6:47:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-14T18:47:27.336Z: Resized worker pool to 2, though goal was 5.  This could be a quota issue.
    Aug 14, 2020 6:47:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-14T18:47:32.717Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Aug 14, 2020 6:47:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-14T18:47:32.748Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Aug 14, 2020 6:47:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-14T18:47:43.467Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 14, 2020 6:47:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-14T18:47:54.365Z: Workers have started successfully.
    Aug 14, 2020 6:47:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-14T18:47:54.396Z: Workers have started successfully.
    Aug 14, 2020 6:48:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-14T18:48:35.584Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 14, 2020 6:48:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-14T18:48:35.765Z: Cleaning up.
    Aug 14, 2020 6:48:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-14T18:48:35.824Z: Stopping worker pool...
    Aug 14, 2020 6:51:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-14T18:51:00.366Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 14, 2020 6:51:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-14T18:51:00.411Z: Worker pool stopped.
    Aug 14, 2020 6:51:10 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-14_11_46_25-4463250600281857589 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 73e42505-b5f7-48ca-b089-19a82bfbfd68 and timestamp: 2020-08-14T18:51:10.116000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    19.339

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 14, 2020 6:51:10 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.05 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.069 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker,5,main]) completed. Took 4 mins 58.481 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 46s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/e6fqenmlbp2lg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #870

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/870/display/redirect>

Changes:


------------------------------------------
[...truncated 293.78 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 14, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 14, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 14, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 14, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 14, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 14, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 14, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 14, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 14, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 14, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 14, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 14, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 14, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 14, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 14, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 14, 2020 12:45:14 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 14, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 14, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-mCb7siRw_SAKgdDgaDG0N-Ge482QFDcUGJJQpiTbFAI.jar
    Aug 14, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-I4PcVI_9o5KQDfWb4MZNjoje4FVG2-suqEIOejRQDE0.jar
    Aug 14, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-tKj-UewRv_anYm5JBKnokYkdhTF05bfUQJNbnbrrmYE.jar
    Aug 14, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-MqBLGFqlBpPQZdHIwjTQ32cVGOtvm1eJ-Oem3zXXJcg.jar
    Aug 14, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4803897913415458139.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-EG9S3INeJEZF96gb6DLARUMLwTMqnlGL4K_bWZi11lU.jar
    Aug 14, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-J2t632B4xC7zCHnoM-WeWptLjsWCNZPcgUj_cVFplRY.jar
    Aug 14, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-Jz13V_kGn1tEoPTiVW-pC4CHfC8uQEc5Tq-1tymHs0Q.jar
    Aug 14, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-4pvIIIYkbcnirf6vn-sjnO316MHaDaFJMoBKjKRVQDI.jar
    Aug 14, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-MOJ3ADi2x04KIYrgtNGdryUvg-ERUSXNkdO-rapNT1Q.jar
    Aug 14, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-uflV8upujO3g4HEXXSbo7S7mXCSV-EBpJvgDnbywYjw.jar
    Aug 14, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-zWqKF4T_lYePeQ8kHew0hg-loZeqDglGuZTqHKZ_-Ro.jar
    Aug 14, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-G_Vn7L1e9oWqcZF0cJk2uPy4hIkbbYRdfeQBw2aR_iY.jar
    Aug 14, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-vvGkoTrKNHuT1RV5B8b7zNuNH1gQ_olvMqfYhgXD88g.jar
    Aug 14, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-2cnyrroKNNyneFxjzDXYuviFxeWYOSdIxbLYCYzyDoU.jar
    Aug 14, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-mCb7siRw_SAKgdDgaDG0N-Ge482QFDcUGJJQpiTbFAI.jar
    Aug 14, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-hJ5dWcLnJG_TrYDLHidcEAGQpb7bYzVMcSS9C3m7cSM.jar
    Aug 14, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-XRqo8u_HG5iLU8uD6gTCSNm--UDm2UK8w8cf9YfNG1k.jar
    Aug 14, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-CsQ7yRj4ozBgApEtCnkwIIlAGWD3kakuZ4u_rSkeSJw.jar
    Aug 14, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT--uz2r0lNP-3ZUtMf7ijy0GeV2oICrECcaEp66-ZKeio.jar
    Aug 14, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-p6eOMuEPPsUPHjL7Ar4vkQUECsMRY5oA4pQA1MVfsoU.jar
    Aug 14, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-PlUMQyRmrTx6l35uk-bi1k__ZKneDkg2KJSvOL0oAn8.jar
    Aug 14, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT--qG_St-PKea95tEd2UDIUFD3T1Pm2FvoGbIZJei31sM.jar
    Aug 14, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-SilT80-aCmQVgPTG1xXRebBWDCUOWqHWGj7nDb2wZWY.jar
    Aug 14, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-Bw9XQQ2sAVwFohLmYlMhfB4upspyXaSBmb9wjlBj7JU.jar
    Aug 14, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-3TXHfRwrInuhY635J_R0Wtl8YLs0p-up2w1pwQ_VNOs.jar
    Aug 14, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-F1rBdMbBM5Ssc0gG9uUstf_OekZG632jjF1-6kayGrs.jar
    Aug 14, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-86qk1cQny2RYFE09gTa9QW38ZAna_PF99TFVLYJ6fTc.jar
    Aug 14, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-JoIFhZFlISi6IvVpf9g6gHGPzcUi31V6s-mrXYynQWU.jar
    Aug 14, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-wRMpF55qXHNuMy76Dg-XWgjXUC7VrgyXzy-B1n3dqfw.jar
    Aug 14, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-W9WCsD6ODf0Jd1e_zg51xUuBahHSRxAuu5ZhWg1AaZo.jar
    Aug 14, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-yqOpQ0QBTuFhuTfoFUhCzLFlaub9gWyoLMS7sAH8KwA.jar
    Aug 14, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 1 seconds
    Aug 14, 2020 12:45:17 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 14, 2020 12:45:17 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 14, 2020 12:45:17 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 14, 2020 12:45:17 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 14, 2020 12:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 14, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93836 bytes, hash c8c93460ae3d96983242f3b5d86466d810a3fdea08e4e80f330a5ca01da841fd> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-yMk0YK49lpgyQvO12GRm2BCj_eoI5OgPMwpcoB2oQf0.pb
    Aug 14, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Aug 14, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-14_05_45_18-12716933617989749269?project=apache-beam-testing
    Aug 14, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-14_05_45_18-12716933617989749269
    Aug 14, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-14_05_45_18-12716933617989749269
    Aug 14, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-14T12:45:18.281Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
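
This warning is expected for these runs: the job asks for a fixed pool of five workers and disables autoscaling, so the requested maximum is simply ignored. A hedged sketch of the corresponding Dataflow options, assuming the public DataflowPipelineOptions / DataflowPipelineWorkerPoolOptions setters (the values are illustrative):

    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions.AutoscalingAlgorithmType;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class FixedWorkerPoolOptionsSketch {
      public static DataflowPipelineOptions build(String[] args) {
        DataflowPipelineOptions options =
            PipelineOptionsFactory.fromArgs(args).as(DataflowPipelineOptions.class);
        // Equivalent to --numWorkers=5 --maxNumWorkers=5 --autoscalingAlgorithm=NONE.
        // With autoscaling disabled, maxNumWorkers has no effect and exactly
        // numWorkers are started, which is what the WARNING above points out.
        options.setNumWorkers(5);
        options.setMaxNumWorkers(5);
        options.setAutoscalingAlgorithm(AutoscalingAlgorithmType.NONE);
        return options;
      }
    }
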
    Aug 14, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-14T12:45:25.594Z: Worker configuration: n1-standard-1 in us-central1-f.
    Aug 14, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-14T12:45:26.437Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 14, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-14T12:45:26.478Z: Expanding GroupByKey operations into optimizable parts.
    Aug 14, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-14T12:45:26.517Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 14, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-14T12:45:26.612Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 14, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-14T12:45:26.641Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 14, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-14T12:45:26.674Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 14, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-14T12:45:26.700Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 14, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-14T12:45:27.208Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 14, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-14T12:45:27.284Z: Starting 5 workers in us-central1-f...
    Aug 14, 2020 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-14T12:45:44.217Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 14, 2020 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-14T12:45:49.799Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 14, 2020 12:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-14T12:46:11.468Z: Workers have started successfully.
    Aug 14, 2020 12:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-14T12:46:11.500Z: Workers have started successfully.
    Aug 14, 2020 12:46:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-14T12:46:42.318Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 14, 2020 12:46:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-14T12:46:42.512Z: Cleaning up.
    Aug 14, 2020 12:46:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-14T12:46:42.598Z: Stopping worker pool...
    Aug 14, 2020 12:47:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-14T12:47:24.248Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 14, 2020 12:47:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-14T12:47:24.294Z: Worker pool stopped.
    Aug 14, 2020 12:47:32 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-14_05_45_18-12716933617989749269 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 0d84a895-6c6e-4505-994c-9361af23ae22 and timestamp: 2020-08-14T12:47:32.792000000Z:
                     Metric:                    Value:
                   read_time                    10.453
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 14, 2020 12:47:33 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.019 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 2 mins 28.184 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 17s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/fgisqvwcejw3c

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #869

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/869/display/redirect?page=changes>

Changes:

[ningk] [BEAM-10545] HtmlView module

[ningk] Use primitive string[] to replace Array<string> type

[srohde] Fix bug that evicting computed PCollections was changing list while

[Valentyn Tymofieiev] Fix Py3 incompatibility in stager.py.

[noreply] Better error on BQ schema parse (#12549)

[ekirpichov] [BEAM-10702] Do not implicitly decompress artifacts

[ekirpichov] Adds a Julia Set test on portable local runner

[ekirpichov] Address review comments


------------------------------------------
[...truncated 293.62 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 14, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 14, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 14, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
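
The query above is planned by Calcite and lowered onto a Beam pipeline through BeamSqlEnv/BeamSqlRelUtils, as the stack traces in this log show. To reproduce the same kind of projection-plus-filter query outside the IT, a minimal in-memory sketch with the public SqlTransform API; the schema, sample rows and table tag are assumptions for illustration only:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.PCollectionTuple;
    import org.apache.beam.sdk.values.Row;
    import org.apache.beam.sdk.values.TupleTag;

    public class SqlFilterSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Toy schema mirroring the four fields the logged query selects.
        Schema schema =
            Schema.builder()
                .addStringField("by")
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();

        PCollection<Row> items =
            p.apply(
                Create.of(
                        Row.withSchema(schema).addValues("alice", "story", "hello", 5L).build(),
                        Row.withSchema(schema).addValues("bob", "comment", "re: hello", 1L).build())
                    .withRowSchema(schema));

        // Registers the PCollection as table HACKER_NEWS and runs the same shape of query.
        PCollection<Row> filtered =
            PCollectionTuple.of(new TupleTag<>("HACKER_NEWS"), items)
                .apply(
                    SqlTransform.query(
                        "SELECT `by` AS author, `type`, `title`, `score` "
                            + "FROM `HACKER_NEWS` "
                            + "WHERE (`type` = 'story' OR `type` = 'job') AND `score` > 2"));

        p.run().waitUntilFinish();
      }
    }
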
    Aug 14, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 14, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 14, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 14, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
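
Both failing tests in this run stop at the same point: the PCollection<Row> emitted after the RowMonitor ParDo carries no schema, so no Row coder can be inferred. The remedy the message itself suggests is to attach the schema explicitly via PCollection.setRowSchema; a minimal sketch of that pattern, with a stand-in DoFn and a schema argument that are assumptions rather than the test's actual code:

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      /** Pass-through DoFn standing in for the RowMonitor used by the test. */
      static class MonitorFn extends DoFn<Row, Row> {
        @ProcessElement
        public void processElement(@Element Row row, OutputReceiver<Row> out) {
          out.output(row);
        }
      }

      static PCollection<Row> monitor(PCollection<Row> input, Schema schema) {
        // Without setRowSchema the runner cannot infer a coder for Row and fails with
        // the IllegalStateException shown above; attaching the schema resolves it.
        return input.apply("RowMonitor", ParDo.of(new MonitorFn())).setRowSchema(schema);
      }
    }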

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 14, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 14, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 14, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 14, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 14, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 14, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 14, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 14, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 14, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 14, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 14, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-Bm-VvqGFGXkLrXL186eLT251eGnsSseTo7UtFq0jcSI.jar
    Aug 14, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-mtQL-OXJB7yBy-wwl5XpYp_TBJc9BWiZwW57sw2aPtE.jar
    Aug 14, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-1uX_VSz9JkDgoQ81PW7GD14--Ej8z4fjLRSYDngS8fk.jar
    Aug 14, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-Sne1JjKlohQ5X1at4PgRmTlqwn7S9tDWpyukCUs32vg.jar
    Aug 14, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-M8TyLI4CPGUzSE7OlPuibwTR-lc-elPfNoJAfABszwY.jar
    Aug 14, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-MKjeEG_QrBy1TQ9vHRVyPJ6bvNFU1Y_QI69fR_e9tVA.jar
    Aug 14, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-dS5gofELpAmXmOocUvEl2sU-9oHYuNBDgFM2UGF0gs4.jar
    Aug 14, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-N9d_MVI3tlALdMAMrq71hGANG9xtwhPbI8iQn-EFRCE.jar
    Aug 14, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-0j0Ol6IQMymhRH6NH2iBld4g6AWrljE2vJhc_70vNtU.jar
    Aug 14, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-Bm-VvqGFGXkLrXL186eLT251eGnsSseTo7UtFq0jcSI.jar
    Aug 14, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-iEeu4-4A2fyrfIcPU9-BJEGpvA7YhPKuvigLdne5yIw.jar
    Aug 14, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-CoiB723dm2Na6_KfJaZ4YOHaAveyiRzXH6HLtHAMWAk.jar
    Aug 14, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-3j8v_nX55CfROiW-lGc6MPeuFfoDYC7jp-ITB2xnOFc.jar
    Aug 14, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-2gRMcBxZ1TVYGUpNIHqsN5r9hEUOOfiMv6y3yhiKHsw.jar
    Aug 14, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test1166200125965666277.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-oLvm3mwa2sVDATVouWkCYgRxqHVrYO-vysBDAciu_fk.jar
    Aug 14, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-4aEeMcRdZPI9kDX8Bb_WZ57HRlmJj-nnTFGoG2k4Tv8.jar
    Aug 14, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-nXzwpGRPS-y33rbbIxqCCBQMoIUyA7i-SHWohpypX3A.jar
    Aug 14, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-oxjGxVeGfgfEpHlUZ1Qis0iEDIvWS_Ap0IRLHg3Qaa0.jar
    Aug 14, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-asRRNnZ2J_e7IvzF2nOUlclMxBc6heCncOrHpsiQhxM.jar
    Aug 14, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-l-uo6QsgCq--ZElMEN0Fp9tduxLu8xjhnoNWgf9xQ3M.jar
    Aug 14, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-CXq2V-dhSay7B3zN04shu13ztVMBObyKnHIzDtyyDl0.jar
    Aug 14, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-ShBqXsKJ5xsChcA8nLv0QIXTns20LiXv_oRGjdxumWc.jar
    Aug 14, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-M45LOtEP3Vy99ib0dmvclqyl3L6o95SXLjY7DNexPuw.jar
    Aug 14, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-HMLXV7cv0UjaZxdxxLzuwQ9XrMCfzz-Qa4qMWi5iSY8.jar
    Aug 14, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-N7TWpUDqScamBHj28dGGuZekYgELY-TSlgYJkC4vUgo.jar
    Aug 14, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-t8gfKMWcSdXPbtn3s4_z9cPETb_lWSrNJ7OOtVtFnac.jar
    Aug 14, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-a9ZfWtIMxzbHklFOX46vHbErhYikLrkMM8kPU5apzmA.jar
    Aug 14, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-MdCb_zVf7mVDnUOoKmofi0qVFj_ckTU_5KhadiU6Q6Y.jar
    Aug 14, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-TgIL_n4Zr_zf-J8Gg1nmRghSO_aN3l5vovQ5SEaU6Xo.jar
    Aug 14, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-U55_j8TbK5jlopzhWjpDo5rrcrZ4cUYjazK9BpZzepg.jar
    Aug 14, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-7V2sz5CS4ne4gCIq3mQyox47d8TsuYQ9dmCh53Nhgaw.jar
    Aug 14, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 1 seconds
    Aug 14, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 14, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 14, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 14, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 14, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 14, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93834 bytes, hash 41fea1d7e3b8d47dfda59109875c08e0b55fd9cacbf966872b03435426c93505> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Qf6h1-O41H39pZEJh1wI4LVf2crL-WaHKwNDVCbJNQU.pb
    Aug 14, 2020 6:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Aug 14, 2020 6:45:28 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-13_23_45_21-376621209340413784?project=apache-beam-testing
    Aug 14, 2020 6:45:28 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-13_23_45_21-376621209340413784
    Aug 14, 2020 6:45:28 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-13_23_45_21-376621209340413784
    Aug 14, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-14T06:45:21.144Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 14, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-14T06:45:35.536Z: Worker configuration: n1-standard-1 in us-central1-f.
    Aug 14, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-14T06:45:36.268Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 14, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-14T06:45:36.308Z: Expanding GroupByKey operations into optimizable parts.
    Aug 14, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-14T06:45:36.340Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 14, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-14T06:45:36.491Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 14, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-14T06:45:36.513Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 14, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-14T06:45:36.532Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 14, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-14T06:45:36.562Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 14, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-14T06:45:36.901Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 14, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-14T06:45:36.993Z: Starting 5 workers in us-central1-f...
    Aug 14, 2020 6:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-14T06:45:49.710Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 14, 2020 6:46:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-14T06:46:04.963Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 14, 2020 6:46:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-14T06:46:22.420Z: Workers have started successfully.
    Aug 14, 2020 6:46:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-14T06:46:22.582Z: Workers have started successfully.
    Aug 14, 2020 6:46:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-14T06:46:52.342Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 14, 2020 6:46:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-14T06:46:52.528Z: Cleaning up.
    Aug 14, 2020 6:46:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-14T06:46:52.616Z: Stopping worker pool...
    Aug 14, 2020 6:47:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-14T06:47:36.985Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 14, 2020 6:47:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-14T06:47:37.016Z: Worker pool stopped.
    Aug 14, 2020 6:47:44 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-13_23_45_21-376621209340413784 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): ae483b7a-6e76-426b-9a2f-695999218976 and timestamp: 2020-08-14T06:47:45.027000000Z:
                     Metric:                    Value:
                   read_time                    10.744
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 14, 2020 6:47:45 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.019 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 2 mins 37.745 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 28s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/b7wymiatjebia

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #868

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/868/display/redirect?page=changes>

Changes:

[tysonjh] Update nexmark dashboard links.

[kcweaver] [BEAM-10646] Remove SparkPortableExecutionTest.testExecution.

[kcweaver] [BEAM-10646] Don't wait for test to time out if pipeline fails.

[filiperegadas] [BEAM-10648] Remove unused BigQuery queryTempDataset value

[noreply] Import WordExtractingDoFn from wordcount_with_metrics

[noreply] [BEAM-9547] Lift associative aggregations. (#12469)

[noreply] Merge pull request #12427 from [BEAM-2855] nexmark python suite

[noreply] [BEAM-10500] Make KeyedTimerDataCoder encode output timestamp (#12535)

[noreply] Extending ApproximateQuantiles functionality to deal with non-uniform

[noreply] fix logic issue in metric name namespace filtering (#12570)


------------------------------------------
[...truncated 295.33 KB...]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 14, 2020 12:49:19 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 14, 2020 12:49:19 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 14, 2020 12:49:19 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 14, 2020 12:49:19 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 14, 2020 12:49:19 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 14, 2020 12:49:19 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 14, 2020 12:49:20 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 14, 2020 12:49:20 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 14, 2020 12:49:20 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 14, 2020 12:49:20 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 14, 2020 12:49:20 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 14, 2020 12:49:20 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 14, 2020 12:49:20 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 14, 2020 12:49:20 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 14, 2020 12:49:20 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
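
As an aside, the pushed-down read reported here corresponds, roughly, to a plain BigQueryIO DIRECT_READ (Storage API) read that requests only the used fields and hands the filter to BigQuery as a row restriction. The sketch below is illustrative only and is not the code path the SQL table provider actually takes: the table spec is a placeholder, and running it for real would need an existing table plus GCP credentials.

    import java.util.Arrays;
    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
    import org.apache.beam.sdk.values.PCollection;

    public class DirectReadSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        // Placeholder table spec; only the projected fields are requested, and the
        // filter string (the same one logged above) is applied server-side by the
        // BigQuery Storage API instead of inside the Beam pipeline.
        PCollection<TableRow> rows = p.apply(
            BigQueryIO.readTableRows()
                .from("some-project:some_dataset.HACKER_NEWS")
                .withMethod(Method.DIRECT_READ)
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                .withRowRestriction(
                    "(`type` = 'story' OR `type` = 'job') AND `score` > 2"));

        p.run().waitUntilFinish();
      }
    }
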
    Aug 14, 2020 12:49:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 14, 2020 12:49:23 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 14, 2020 12:49:23 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-mYNmDi-lpqz3b-umeWZS18sv5Mn1Wa0XkZ_-ieYVXwE.jar
    Aug 14, 2020 12:49:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-jwcM37QFOPC8GG99aMAXW36FLqF5jxQTg4zsawJzxYA.jar
    Aug 14, 2020 12:49:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-7xV5RtMzXePEKm36rz4CsYBvWVfNCdHoFNCDnX_D9qE.jar
    Aug 14, 2020 12:49:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-zj4PR7Ds0_yyPM9FSR89PwtSPHQLCl2CZYC1TxAE2UY.jar
    Aug 14, 2020 12:49:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-WQv2V3FJl6H3sqXInzPT3GOLXqXVYR2OMkRt6Ob_Jvk.jar
    Aug 14, 2020 12:49:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-mYNmDi-lpqz3b-umeWZS18sv5Mn1Wa0XkZ_-ieYVXwE.jar
    Aug 14, 2020 12:49:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-DLvBzRzDpILfBIFqwvy6Lb-yOa8l1vg4fCB3xeQmhu4.jar
    Aug 14, 2020 12:49:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-9yn-6_5_GN32vqtwhZJ4uhhN_oWH-uZWQhSuikVe-2s.jar
    Aug 14, 2020 12:49:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-Up8VmF4xHbKxYHGz3DCdRBqISW9x13Xa7eDntDYZPvg.jar
    Aug 14, 2020 12:49:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-f5vfHe9bfqzM1mnAhsU8L5QiJO017RjqtziZ9LOykWk.jar
    Aug 14, 2020 12:49:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-QFSHEqeva9k40Kxe1j7EfhL0X5BuSya-DCEDiboQPUA.jar
    Aug 14, 2020 12:49:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-2IQYadD-co5ll5iIoXvNebqLYvxiHKtfT_alTSvYdWM.jar
    Aug 14, 2020 12:49:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-j-rVEmiIc8LhwTsDK4a-BaP1-QUgZDkLxf8HccHkz0s.jar
    Aug 14, 2020 12:49:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-8ge_715ksQq1Bbbyg3aldn3u5pXaADdYu4_2TgPQOaI.jar
    Aug 14, 2020 12:49:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-x4R_sLe9O12YR0x5b1IeIMzePFn_JiFX7_-aNdrV1gg.jar
    Aug 14, 2020 12:49:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-guj4ENZSemZfC0i-LETvl1zH3iPlC4CnmpSi9llGCkA.jar
    Aug 14, 2020 12:49:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-4RxJCfaZmA9B_R2I2MLWMM6MwoCEY_lr3aLYhQaKk0U.jar
    Aug 14, 2020 12:49:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-k1aG61uEDNDEMfPiXVqmeeY0Slg_6ObEkI8xfyOV4YA.jar
    Aug 14, 2020 12:49:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-d8x8gVcttq50MeHONCdN8CRkEnWRRJG6o8fWqeGfGEA.jar
    Aug 14, 2020 12:49:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-851_9Hil-s-L6xtwHBBza9PaEP2PrrGZfdUKzCX_wk0.jar
    Aug 14, 2020 12:49:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-lb2yGCl7Oh969Th04nmm2W3SONJrsxEcmAcAijiU1rQ.jar
    Aug 14, 2020 12:49:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test18263546164170442.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-ruBwcMQYQss2ZJfVGDk-y0l5_WVK60KtNCAtoGYODJI.jar
    Aug 14, 2020 12:49:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-mX13UGeqZlbcC7PnW54xiSJ7nrWKMZfnEhkYlk9Wnds.jar
    Aug 14, 2020 12:49:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-k7IcljzNZzWm-bZbC_Pbsp-bgOi22MWEDoNpT0ruBU0.jar
    Aug 14, 2020 12:49:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-yhAOtUEboccVKstW3KkbvMk2WRn_iiXzY0zM5jFDSDg.jar
    Aug 14, 2020 12:49:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-HQrvy0OLwHOoGRi9c6hTIjaPtnzy-C4hVh_RgGkA2cQ.jar
    Aug 14, 2020 12:49:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-g2g86qsCZL-fe9EHhguj04jvxqXgoJByCHUr-1HcJRI.jar
    Aug 14, 2020 12:49:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-vypRuJtavCg2iL-9L_K5am51v3qPYhvueDwhAtil-8k.jar
    Aug 14, 2020 12:49:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-8olAr5-527WG180VsaH74lY6xpN64FmJSEO0-W7vK9s.jar
    Aug 14, 2020 12:49:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-FvINdcOB2Y9eKVopfHOQLC6_D0OmC1ufWho5IBuRatQ.jar
    Aug 14, 2020 12:49:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-5wqbTwtejajxSeN_MwE2HAMicwC8ooIJiNrvMb3eAOE.jar
    Aug 14, 2020 12:49:24 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 1 seconds
    Aug 14, 2020 12:49:24 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 14, 2020 12:49:24 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 14, 2020 12:49:24 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 14, 2020 12:49:24 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 14, 2020 12:49:24 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 14, 2020 12:49:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93832 bytes, hash a41a6591551a576d2569f3eb0c1b0fa1b9b9add4c59eff2ac9a5e486dc0eb912> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-pBplkVUaV20lafPrDBsPobm5rdTFnv8qyaXkhtwOuRI.pb
    Aug 14, 2020 12:49:25 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Aug 14, 2020 12:49:26 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-13_17_49_25-7279414762491469622?project=apache-beam-testing
    Aug 14, 2020 12:49:26 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-13_17_49_25-7279414762491469622
    Aug 14, 2020 12:49:26 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-13_17_49_25-7279414762491469622
    Aug 14, 2020 12:49:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-14T00:49:25.309Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 14, 2020 12:49:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-14T00:49:34.065Z: Worker configuration: n1-standard-1 in us-central1-a.
    Aug 14, 2020 12:49:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-14T00:49:34.794Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 14, 2020 12:49:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-14T00:49:34.845Z: Expanding GroupByKey operations into optimizable parts.
    Aug 14, 2020 12:49:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-14T00:49:34.954Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 14, 2020 12:49:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-14T00:49:35.050Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 14, 2020 12:49:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-14T00:49:35.085Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 14, 2020 12:49:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-14T00:49:35.130Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 14, 2020 12:49:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-14T00:49:35.151Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 14, 2020 12:49:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-14T00:49:35.642Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 14, 2020 12:49:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-14T00:49:35.733Z: Starting 5 workers in us-central1-a...
    Aug 14, 2020 12:49:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-14T00:49:54.562Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 14, 2020 12:50:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-14T00:50:02.604Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
    Aug 14, 2020 12:50:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-14T00:50:02.632Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
    Aug 14, 2020 12:50:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-14T00:50:08.007Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 14, 2020 12:50:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-14T00:50:25.504Z: Workers have started successfully.
    Aug 14, 2020 12:50:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-14T00:50:25.542Z: Workers have started successfully.
    Aug 14, 2020 12:50:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-14T00:50:57.616Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 14, 2020 12:50:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-14T00:50:57.826Z: Cleaning up.
    Aug 14, 2020 12:50:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-14T00:50:57.919Z: Stopping worker pool...
    Aug 14, 2020 12:51:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-14T00:51:50.531Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 14, 2020 12:51:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-14T00:51:50.589Z: Worker pool stopped.
    Aug 14, 2020 12:51:58 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-13_17_49_25-7279414762491469622 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): b6fe715d-c740-4062-ab50-c3135d633e11 and timestamp: 2020-08-14T00:51:58.986000000Z:
                     Metric:                    Value:
                   read_time                     14.54
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 14, 2020 12:51:59 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':',5,main]) completed. Took 2 mins 47.829 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 58s
106 actionable tasks: 72 executed, 34 from cache

Publishing build scan...
https://gradle.com/s/rakym7nuso7cw

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #867

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/867/display/redirect?page=changes>

Changes:

[noreply] [BEAM-9680] Add Filter with ParDo lesson to Go SDK Katas (#12506)


------------------------------------------
[...truncated 296.95 KB...]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 13, 2020 6:45:29 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 13, 2020 6:45:29 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 13, 2020 6:45:29 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 13, 2020 6:45:29 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 13, 2020 6:45:29 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 13, 2020 6:45:29 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 13, 2020 6:45:29 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 13, 2020 6:45:29 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 13, 2020 6:45:29 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 13, 2020 6:45:29 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 13, 2020 6:45:29 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 13, 2020 6:45:29 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 13, 2020 6:45:29 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 13, 2020 6:45:30 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 13, 2020 6:45:30 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 13, 2020 6:45:30 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 13, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 13, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-hFGpbwdrX4i4d9sGXSS-q4NqLZ_Vb5aG5p1rsOBEuqc.jar
    Aug 13, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-zeQjRywV9AxfEKbl1c1rtBwHPfEUCRSmLzl1Z0_uxlw.jar
    Aug 13, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-zHVfbHWVL9LmaGrMX57nAkRN6CcE10BsB5lO0MORePo.jar
    Aug 13, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-8FZV0vRznxw8l3Iu5Hs1wrErQSi0FrI5j0kk28TX7y4.jar
    Aug 13, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-RnIeBHEO9ATatw8HKxAJp-wdISsWw1viOANHWKNNGII.jar
    Aug 13, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-HaFKfjJfLOA2uPqTiMlrXAEoCWnFrsj4RjDX0Fl0VGU.jar
    Aug 13, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-YfubInNDwtdlrd7Xw00de2uOVtJ63-sdStTQGaggvDY.jar
    Aug 13, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-7bxi0hX2aF8B3F3oUG5vwPE6Yw7JNLWZX7tck87Ha9s.jar
    Aug 13, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-bp78Bt4yN3lQNCzleMgQ7RzPrTLWlc91DtOr-CKT2WI.jar
    Aug 13, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-BbZAjG75T8zaVkfj9LMCJSU-vex2PsIPC_bcDJAUESk.jar
    Aug 13, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-Lo-Wuh3TVz1FStr_Rm0jNMhctyWS7tO5dyVPQuBqG5U.jar
    Aug 13, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-N5WzmsnwRJPS7CkieP4UFJb8rLjrAOlOSyC2YVdlJZg.jar
    Aug 13, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-C0KmHJ_b_PvoLUzECIi6NDRh9qgTiYSjs_HnTWd0CAE.jar
    Aug 13, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-5jjHRicpMnWMSAiF0ZAajghiufryz4iKr2pTbrR6XWs.jar
    Aug 13, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-0K1sYCElHY4I4o0yXILFGPZlNC0YtT0mbVgXjy_3RnM.jar
    Aug 13, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test806880648525958686.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-Rkz5eWXoSgvOT4GaeKBYGePjXIS93Ube5N_3PXkX1m8.jar
    Aug 13, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-mIU4oVsrJHuLzkCxRZeWYRjtONxd6yAmL-_BwE5c758.jar
    Aug 13, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-AZdhEahqCf8qsGysuzqTXuMfs4Q3b8qHLjO1K_H9GF8.jar
    Aug 13, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-ZFgWkZtzqL4DnVq9ilmLaBlE9TNzHtO2ULUO2gPKzz8.jar
    Aug 13, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-hFGpbwdrX4i4d9sGXSS-q4NqLZ_Vb5aG5p1rsOBEuqc.jar
    Aug 13, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT--kfvVJpFj1_1yTQ7jZ6kH53dA-pZFSQezpaLfBSuKww.jar
    Aug 13, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-UQL0LqcrwfuxpkvICJBcxpxgiAMJz6Y9iQVdtKBHn0E.jar
    Aug 13, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-xmgxzPgeKn9wDZJWLa5-CN__85u2G69lSG4FJBjidxo.jar
    Aug 13, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-KD8E_k5CZE56q7n3o7eVexagnWeqYM8oodPt15NOKxU.jar
    Aug 13, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-pMzWoKqrx9dQ0owRS-n72eeg3Lx3J1olbJKeD-cOIWs.jar
    Aug 13, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-_uq6tdwrdemWOeUNVka2rZu2B1GapDyeRGwNWKKXNeg.jar
    Aug 13, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-DiaR3_-ekwW_GVXEv3sjxK2gsbacjun7regihgqVjQA.jar
    Aug 13, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-uPBQrUijE9xdPFoi4_F2ugLKyTZ7OK-KvZ4ZRXbZYHE.jar
    Aug 13, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-VB4baMqd7-nwCy-4iarpeOqLYmCnd82u97eNTstvMbs.jar
    Aug 13, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-M5zvNatP8Z7-4RFfygh7VGwBDmPbw1ibOPNKZBjsTTw.jar
    Aug 13, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-obnomzZO_Qb7PdeyGf8eQ7g4Tt9-_bHLJDzxGEvCJRo.jar
    Aug 13, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 1 seconds
    Aug 13, 2020 6:45:34 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 13, 2020 6:45:34 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 13, 2020 6:45:34 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 13, 2020 6:45:34 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 13, 2020 6:45:34 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 13, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93832 bytes, hash 9e954cbe2ac62f0f7ffd70e40eeb34c0eed87899702fc30a601350c5b563963a> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-npVMvirGLw9__XDkDus0wO7YeJlwL8MKYBNQxbVjljo.pb
    Aug 13, 2020 6:45:34 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Aug 13, 2020 6:45:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-13_11_45_34-79948279913845866?project=apache-beam-testing
    Aug 13, 2020 6:45:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-13_11_45_34-79948279913845866
    Aug 13, 2020 6:45:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-13_11_45_34-79948279913845866
    Aug 13, 2020 6:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-13T18:45:34.718Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 13, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-13T18:45:43.861Z: Worker configuration: n1-standard-1 in us-central1-a.
    Aug 13, 2020 6:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-13T18:45:44.598Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 13, 2020 6:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-13T18:45:44.636Z: Expanding GroupByKey operations into optimizable parts.
    Aug 13, 2020 6:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-13T18:45:44.666Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 13, 2020 6:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-13T18:45:44.741Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 13, 2020 6:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-13T18:45:44.771Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 13, 2020 6:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-13T18:45:44.805Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 13, 2020 6:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-13T18:45:44.840Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 13, 2020 6:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-13T18:45:45.171Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 13, 2020 6:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-13T18:45:45.253Z: Starting 5 workers in us-central1-a...
    Aug 13, 2020 6:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-13T18:46:08.147Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 13, 2020 6:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-13T18:46:12.943Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Aug 13, 2020 6:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-13T18:46:12.971Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Aug 13, 2020 6:46:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-13T18:46:18.424Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 13, 2020 6:46:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-13T18:46:33.573Z: Workers have started successfully.
    Aug 13, 2020 6:46:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-13T18:46:33.621Z: Workers have started successfully.
    Aug 13, 2020 6:47:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-13T18:47:07.866Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 13, 2020 6:47:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-13T18:47:08.125Z: Cleaning up.
    Aug 13, 2020 6:47:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-13T18:47:08.204Z: Stopping worker pool...
    Aug 13, 2020 6:48:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-13T18:47:59.304Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 13, 2020 6:48:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-13T18:47:59.345Z: Worker pool stopped.
    Aug 13, 2020 6:48:06 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-13_11_45_34-79948279913845866 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): c3cb568f-08d5-4c28-90ec-71ce42b951e0 and timestamp: 2020-08-13T18:48:06.408000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    14.423

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 13, 2020 6:48:06 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.019 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 2 mins 45.429 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 50s
106 actionable tasks: 73 executed, 33 from cache

Publishing build scan...
https://gradle.com/s/fddgc3vkq64yg

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #866

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/866/display/redirect?page=changes>

Changes:

[noreply] [BEAM-10694] Work around serialization issue with ReaderContext by


------------------------------------------
[...truncated 294.87 KB...]
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 13, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 13, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 13, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 13, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 13, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 13, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 13, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
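
The error text above already names the remedy: a PCollection of raw Beam Rows needs either an explicit schema or a RowCoder before the pipeline is finalized. The following is a minimal illustrative sketch only, not the integration test's actual code; the two-column schema and the ad-hoc DoFn standing in for ParDo(RowMonitor) are made up for the example.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaFixSketch {
      public static void main(String[] args) {
        Pipeline pipeline = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Hypothetical two-column schema standing in for the projected HACKER_NEWS fields.
        final Schema schema =
            Schema.builder().addStringField("type").addInt64Field("score").build();

        // A ParDo that emits raw Rows, analogous to the RowMonitor step in the failing pipeline.
        PCollection<Row> rows =
            pipeline
                .apply(Create.of("story,3", "job,5"))
                .apply(
                    ParDo.of(
                        new DoFn<String, Row>() {
                          @ProcessElement
                          public void process(@Element String line, OutputReceiver<Row> out) {
                            String[] parts = line.split(",");
                            out.output(
                                Row.withSchema(schema)
                                    .addValues(parts[0], Long.parseLong(parts[1]))
                                    .build());
                          }
                        }));

        // The fix the message asks for: attach the schema so a SchemaCoder is used.
        // Setting rows.setCoder(RowCoder.of(schema)) explicitly would also satisfy it.
        rows.setRowSchema(schema);

        pipeline.run().waitUntilFinish();
      }
    }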

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 13, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 13, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 13, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 13, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 13, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 13, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 13, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 13, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
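
For comparison only: the pushed-down plan above hands the column projection (usedFields) and the row filter to the BigQuery storage source, which corresponds roughly to the hand-written DIRECT_READ below. This is a hedged sketch, not the code this test runs, and the Hacker News table reference is a placeholder assumption.

    import com.google.api.services.bigquery.model.TableRow;
    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.values.PCollection;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline pipeline = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Placeholder table; substitute whatever table the test is configured to read.
        PCollection<TableRow> rows =
            pipeline.apply(
                "Read with push-down",
                BigQueryIO.readTableRows()
                    .from("bigquery-public-data:hacker_news.full")
                    .withMethod(BigQueryIO.TypedRead.Method.DIRECT_READ)
                    // Column projection: only the fields the SQL query selects are read.
                    .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                    // Row restriction evaluated by the BigQuery Storage API,
                    // mirroring the filter logged above.
                    .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

        pipeline.run().waitUntilFinish();
      }
    }
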
    Aug 13, 2020 12:45:12 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 13, 2020 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 13, 2020 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-jsFGfqpk4kzLX-506YApnx_BCshvjXTQA4L62olGSk4.jar
    Aug 13, 2020 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-7d5t6EbadwPLtnydIkKdHVAZDaosind0OFgVvm4y5dY.jar
    Aug 13, 2020 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-BbxnXRdaqxb2F-0wONEMJiSFDudHLwLWad6dxcWEGUc.jar
    Aug 13, 2020 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-Cqiol86A1wRAcoe0eldrBkIgqtGwmcN7wFKjRro8GlM.jar
    Aug 13, 2020 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-UH8_GnF2FYd1lJsXev-blYMUthuchiKUBsMkQwQZDBE.jar
    Aug 13, 2020 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-JWyAyCZ591LjpAdZVGUhH-qC7ZlPqGU-09U2SKuujiY.jar
    Aug 13, 2020 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-sFTFTR_OVWoFBipA6zNjymSOsSGvbb-pIlrYTL-MpbA.jar
    Aug 13, 2020 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-CdouKtoMwM3HSAAb0vGN2okrwVmIdK8f4bNoFdlUq4w.jar
    Aug 13, 2020 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-jsFGfqpk4kzLX-506YApnx_BCshvjXTQA4L62olGSk4.jar
    Aug 13, 2020 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-GBcZjM_ZQZWIXq8Sl-wXJdDy4K5mS1B3VMZTEcmHSbA.jar
    Aug 13, 2020 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-w0FZ18FmvG4KS70p1reD39CPWvSdh8-9_UBDopb1R7k.jar
    Aug 13, 2020 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-PL_DnR84pesvbIpLEvvljfYHIVyVrnNZRcU6gA4oo1A.jar
    Aug 13, 2020 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-jzWU-J4myI27E53BxD5dttdFiaKjBaFe2_AE98O0GWg.jar
    Aug 13, 2020 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-qOhkPGb91R9jvQQR1S5oLdUwgi_QrwKZnddvzvMFysw.jar
    Aug 13, 2020 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-aBDCupwLqKhoICdZIGZdrdxbYK-mecEmQNdtddUGZh4.jar
    Aug 13, 2020 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-6phxk3NeXQ_pfZWxnVfH6eEsPRlzmS-zTeQLY3OMloE.jar
    Aug 13, 2020 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-YlfEvp0QqG0Xw5nOm6XTTwc-6s3tzNSk4CiJtpXFGlc.jar
    Aug 13, 2020 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-aexUH7x_HUcUDXiULo4IkM1uJCuE9kxlsE2NvxIMZpQ.jar
    Aug 13, 2020 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-nqD2R9Wfd9JESR_b5Nr00dTOcMIugQCDVSDGrMT9ZNw.jar
    Aug 13, 2020 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7789166319056592288.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-9XnKFGk9M2RGLKenfJTJYaWMlmL1mhrSsfuN6MvUMUo.jar
    Aug 13, 2020 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-QL-tLAlfXZqAzfpFBxowADo1iJeM3_LyclNq6e4M32s.jar
    Aug 13, 2020 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-03fr07NhID6vxppv5mvTSN4GvPiV79avFi6bNZsidxc.jar
    Aug 13, 2020 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-N7MSf8ISrqz7DMFipuqpv4bXUGuK9eoXOcaHZ8ulNbc.jar
    Aug 13, 2020 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-U04IlLFcXDlNmygllSR6ff-HGbh1vhROSUg8Jd17O5o.jar
    Aug 13, 2020 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-R7PdsYMKbB6WWB32ZuqA_7DLX1PsoZ0r8Eejg1c41D8.jar
    Aug 13, 2020 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-o4sRgp390W5NE2_re40OHHRV9Ydnv8R9AbAJlbICLK0.jar
    Aug 13, 2020 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT--Hev-pfjq82KpMdqjwDyvFkDJhzCtaqSG0VaWOxJItU.jar
    Aug 13, 2020 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-AiDAjPe1vkWTw9yPvNC0Fu-eHL1qrAFsGHgdhgb1uQ0.jar
    Aug 13, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-K9r1Zjyx568pm7SN-9TeFRgqx3QDRSm7Dnvv4N8xXm0.jar
    Aug 13, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-mqtQ8X0UOjkyxUCE2Rm58C_e88_CgJTIj4zLlWZ_oCk.jar
    Aug 13, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-LpbI30LNWWZ4wTEMKeOCFimT_cMr17RzFBfwqtPBwaM.jar
    Aug 13, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 1 seconds
    Aug 13, 2020 12:45:15 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 13, 2020 12:45:15 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 13, 2020 12:45:15 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 13, 2020 12:45:15 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 13, 2020 12:45:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 13, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93834 bytes, hash 1469ff8378f59b059b34077ebc78472a09d74077ec5e177c5bf70188d132ff3a> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-FGn_g3j1mwWbNAd-vHhHKgnXQHfsXhd8W_cBiNEy_zo.pb
    Aug 13, 2020 12:45:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Aug 13, 2020 12:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-13_05_45_15-9557858453225778052?project=apache-beam-testing
    Aug 13, 2020 12:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-13_05_45_15-9557858453225778052
    Aug 13, 2020 12:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-13_05_45_15-9557858453225778052
    Aug 13, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-13T12:45:15.747Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 13, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-13T12:45:24.621Z: Worker configuration: n1-standard-1 in us-central1-f.
    Aug 13, 2020 12:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-13T12:45:25.365Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 13, 2020 12:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-13T12:45:25.412Z: Expanding GroupByKey operations into optimizable parts.
    Aug 13, 2020 12:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-13T12:45:25.443Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 13, 2020 12:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-13T12:45:25.516Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 13, 2020 12:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-13T12:45:25.540Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 13, 2020 12:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-13T12:45:25.575Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 13, 2020 12:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-13T12:45:25.599Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 13, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-13T12:45:25.948Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 13, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-13T12:45:26.037Z: Starting 5 workers in us-central1-f...
    Aug 13, 2020 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-13T12:45:40.626Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 13, 2020 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-13T12:45:55.742Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Aug 13, 2020 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-13T12:45:55.770Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Aug 13, 2020 12:46:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-13T12:46:01.153Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 13, 2020 12:46:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-13T12:46:21.227Z: Workers have started successfully.
    Aug 13, 2020 12:46:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-13T12:46:21.262Z: Workers have started successfully.
    Aug 13, 2020 12:46:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-13T12:46:55.413Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 13, 2020 12:46:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-13T12:46:55.717Z: Cleaning up.
    Aug 13, 2020 12:46:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-13T12:46:55.802Z: Stopping worker pool...
    Aug 13, 2020 12:47:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-13T12:47:47.436Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 13, 2020 12:47:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-13T12:47:47.495Z: Worker pool stopped.
    Aug 13, 2020 12:47:55 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-13_05_45_15-9557858453225778052 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 9743347d-bb3d-450a-8f77-666bd71f0d17 and timestamp: 2020-08-13T12:47:55.573000000Z:
                     Metric:                    Value:
                   read_time                    14.979
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 13, 2020 12:47:55 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 5,5,main]) completed. Took 2 mins 52.989 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 40s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/avmulyfdaxdbk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #865

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/865/display/redirect>

Changes:


------------------------------------------
[...truncated 293.62 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 13, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 13, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 13, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 13, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 13, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 13, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 13, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 13, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 13, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 13, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 13, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 13, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 13, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 13, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 13, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 13, 2020 6:45:13 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 13, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 13, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-wnaiaAk2D0Uz0VbNGf9aPkrEDrcrlbbyYxyGU4CwufY.jar
    Aug 13, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-Z7l9c6wxtqJC40_fMS31xL9F3b4rI9SCfHVNmhNCXv4.jar
    Aug 13, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-UfuibdshkFt41mdlMPEdasg2nY1JAf0a4-g--JYVw4g.jar
    Aug 13, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-Zvl2-PcSLjz9TLjXsdmJb3DTEMNkM5pZnmGhBy4WWmA.jar
    Aug 13, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-wnaiaAk2D0Uz0VbNGf9aPkrEDrcrlbbyYxyGU4CwufY.jar
    Aug 13, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-fwuZXPQU9XsmxD8WnX6EkSUjtdcvdj0bb05NtcuVl9Y.jar
    Aug 13, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-nnyVgReIEX_M1nA74u7egWKJXmXNSR-JY8qvpGRfJgg.jar
    Aug 13, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test1044720405599946238.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-JdMhqitTE7_AZMUVdpeEaIM9LuxCvIZbpVf64CZ1ahU.jar
    Aug 13, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-S88VahTtuNq6xmufiNOVv-xjvO7f_xQ6dZuZg5czTLM.jar
    Aug 13, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-KkXCUHN7VfoqnbKz4GkSPQ-om9WT0W0hhCLG69tygdo.jar
    Aug 13, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-spH8TZqMR8kNnZ3pvVoEkqsuDrgvOJKLiWsfyE3iYDk.jar
    Aug 13, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-T8z9i9ZTca-omh5ChW1k45c7gqmMcScJKjuLjRBgm6A.jar
    Aug 13, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-AnHnE73VxXgsfAwQQhfSj270HDePo1CNjrSwjCIMxIE.jar
    Aug 13, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-3rodrTEtBPqipyBU_KtIcO1OTtTK6PmfOqPz6ViRDt8.jar
    Aug 13, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-n4c_M-lUgLudmYVA7NtCVZE1XiJWtRkPfNDz7IjNw0A.jar
    Aug 13, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-FskmuTsji7hr6Zy9LgqzgOchlpfnL8cwaDCmbdMol8c.jar
    Aug 13, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-BmdE8q3EnlpxdO28cn2tualSCFa6kc2heH3mE0awS38.jar
    Aug 13, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-TgN7rFi_1AuTE8w_ppBZ4hRQU7hjRgFG5tS-BIidexU.jar
    Aug 13, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests--TPdv0deqqSKy7-h_b1OyNVTXKf-fSt4nAlr7liBEHA.jar
    Aug 13, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-YAUhKcBZw1Cg6v3QsI0dSYawNmbVx_uQTcGF-gMsZE0.jar
    Aug 13, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-YKVphvcIe4jiUgp0UXh2J8opSkl1u1jujrLJ6JsallU.jar
    Aug 13, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-Dlaszm_EmT8PCrGxCKDyNpjr2-JNMi0H_SmgW0s0uec.jar
    Aug 13, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-q_dZJtu3dpn8wxWk5QEYtekkCSipZ-HcaHCBmeVP6eY.jar
    Aug 13, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-VPQGbh6GSka8ymp7-9nSMx-ddeLkUdNsaceDNVD3F4Q.jar
    Aug 13, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-chHebSAhXjfbI_QdIoKAVlvkNYkJvQZAomMMQsap96U.jar
    Aug 13, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-jr3L9HZuVnUmHd-txrQsgugAccQ4KJqGRE3pkp9ZqZ4.jar
    Aug 13, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-LMZ9YbU5RHamwVEWhX1AwN0L0E9xXUmA0NPKTEuGCPo.jar
    Aug 13, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-L5UgAfgctXsWnUQ9LaPP3DASVWGfh12gSCTX7NF_kxM.jar
    Aug 13, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-NKLzv9I0qL_ZsNlFykFMxa_k05jH4lU1x9yO1FMugL4.jar
    Aug 13, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-0q41EeQWNZeukgsq975OYvXLTIDjrhm2sXdCv69vMZ4.jar
    Aug 13, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-GXjtmM6HCTG0Uaz2ZIKo8i3jaZjG0GaKrq9rx3cA5d0.jar
    Aug 13, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 1 seconds
    Aug 13, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 13, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 13, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 13, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 13, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 13, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93834 bytes, hash 5f99ebe5bbd640f22319cf4171c61cb6f4fd4bc2b7e35ef12aea660eb3ca6ef7> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-X5nr5bvWQPIjGc9BccYctvT9S8K3417xKupmDrPKbvc.pb
    Aug 13, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Aug 13, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-12_23_45_17-14362847959883666158?project=apache-beam-testing
    Aug 13, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-12_23_45_17-14362847959883666158
    Aug 13, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-12_23_45_17-14362847959883666158
    Aug 13, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-13T06:45:17.177Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 13, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-13T06:45:25.049Z: Worker configuration: n1-standard-1 in us-central1-f.
    Aug 13, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-13T06:45:25.683Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 13, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-13T06:45:25.723Z: Expanding GroupByKey operations into optimizable parts.
    Aug 13, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-13T06:45:25.758Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 13, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-13T06:45:25.832Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 13, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-13T06:45:25.862Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 13, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-13T06:45:25.896Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 13, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-13T06:45:25.920Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 13, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-13T06:45:26.321Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 13, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-13T06:45:26.400Z: Starting 5 workers in us-central1-f...
    Aug 13, 2020 6:45:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-13T06:45:49.223Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 13, 2020 6:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-13T06:45:50.865Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 13, 2020 6:46:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-13T06:46:10.342Z: Workers have started successfully.
    Aug 13, 2020 6:46:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-13T06:46:10.379Z: Workers have started successfully.
    Aug 13, 2020 6:46:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-13T06:46:51.545Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 13, 2020 6:46:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-13T06:46:51.764Z: Cleaning up.
    Aug 13, 2020 6:46:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-13T06:46:51.849Z: Stopping worker pool...
    Aug 13, 2020 6:47:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-13T06:47:41.711Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 13, 2020 6:47:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-13T06:47:41.745Z: Worker pool stopped.
    Aug 13, 2020 6:47:50 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-12_23_45_17-14362847959883666158 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): f0b177e7-6ae3-4a69-905c-0466cad1083f and timestamp: 2020-08-13T06:47:50.676000000Z:
                     Metric:                    Value:
                   read_time                     21.65
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 13, 2020 6:47:51 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
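
The warning above means the run's metrics were printed to the console but not pushed to InfluxDB, because the measurement and database settings were absent. A purely illustrative sketch of that guard pattern follows; the class, method, and property names below are hypothetical and are not Beam's actual InfluxDBPublisher API.

    import java.util.logging.Logger;

    // Hypothetical stand-in for a metrics publisher guard; not Beam's real InfluxDBPublisher API.
    public final class MetricsPublishCheck {
      private static final Logger LOG = Logger.getLogger(MetricsPublishCheck.class.getName());

      // Publish only when both settings are present; otherwise warn and skip,
      // mirroring the behaviour reported in the log line above.
      static void publishWithCheck(String measurement, String database, Runnable publish) {
        if (measurement == null || measurement.isEmpty()
            || database == null || database.isEmpty()) {
          LOG.warning("Missing property -- measurement/database. Metrics won't be published.");
          return;
        }
        publish.run();
      }

      public static void main(String[] args) {
        // The property names here are assumptions made for this sketch, not documented Beam options.
        publishWithCheck(
            System.getProperty("influxMeasurement"),
            System.getProperty("influxDatabase"),
            () -> LOG.info("publishing metrics..."));
      }
    }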

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.022 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 2,5,main]) completed. Took 2 mins 47.435 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 34s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/rrbdijcf74pwe

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #864

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/864/display/redirect?page=changes>

Changes:

[srohde] Add max count to utils.to_element_list

[kcweaver] [BEAM-9558] Remove usage of empty data/timers to signify last.

[Maximilian Michels] [BEAM-10676] Use the fire timestamp as the output timestamp for timers

[Maximilian Michels] Use input timestamp as the output timestamp for processing timers

[Maximilian Michels] Add test

[Maximilian Michels] lint

[Robert Burke] [BEAM-10610] Clean logging cruft from loop back.

[noreply] Follow the same way that BigQuery handles unspecified or duplicate

[noreply] Merge pull request #12489 from [BEAM-6064] Add an option to avoid

[srohde] Fix BCJ to stop caching when the cache signature has changed.

[daniel.o.programmer] [BEAM-10289] Adding required transform ID to Go channel split.

[noreply] [BEAM-2762] Generate Python coverage reports during pre-commit (#12257)

[daniel.o.programmer] Moving to 2.25.0-SNAPSHOT on master branch.


------------------------------------------
[...truncated 326.84 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 13, 2020 12:50:59 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 13, 2020 12:50:59 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 13, 2020 12:50:59 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 13, 2020 12:50:59 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 13, 2020 12:50:59 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 13, 2020 12:50:59 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 13, 2020 12:50:59 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
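
The IllegalStateException above lists its own remedies: either set a Coder explicitly with setCoder(), or attach a schema with PCollection.setRowSchema so a RowCoder can be inferred. The standalone Java sketch below only illustrates the second remedy on a toy pipeline; it is not the fix applied to BigQueryIOPushDownIT, and the schema fields and pass-through DoFn are illustrative stand-ins for the query columns and ParDo(RowMonitor) seen in the log.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Illustrative schema mirroring the columns selected by the query in the log.
        Schema schema = Schema.builder()
            .addStringField("author")
            .addStringField("type")
            .addStringField("title")
            .addInt64Field("score")
            .build();

        PCollection<Row> rows = p.apply(
            Create.of(Row.withSchema(schema)
                    .addValues("someone", "story", "example", 3L)
                    .build())
                .withRowSchema(schema));

        // Pass-through ParDo whose Row output has no inferable coder,
        // standing in for ParDo(RowMonitor) in the stack trace above.
        PCollection<Row> monitored = rows.apply("Monitor",
            ParDo.of(new DoFn<Row, Row>() {
              @ProcessElement
              public void processElement(@Element Row row, OutputReceiver<Row> out) {
                out.output(row);
              }
            }));

        // Attaching the schema gives the output a default RowCoder, so pipeline
        // construction no longer fails with the IllegalStateException shown above.
        monitored.setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }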

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 13, 2020 12:50:59 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 13, 2020 12:50:59 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 13, 2020 12:50:59 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 13, 2020 12:50:59 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 13, 2020 12:50:59 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 13, 2020 12:50:59 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 13, 2020 12:50:59 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 13, 2020 12:50:59 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 13, 2020 12:51:01 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 13, 2020 12:51:03 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 13, 2020 12:51:03 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-BhusfQmhpdHB6w0TUfhSgcG945hA0pWCRIv3_g2u6Rw.jar
    Aug 13, 2020 12:51:03 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-oPeC8IyjcaEGQK_pV74w1tQYhbUpxHtnONwVkFnPMYU.jar
    Aug 13, 2020 12:51:03 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-INQTsY6JZTvf60EO2NFU3kfX0nR-Zsw3QcVzcLW5shA.jar
    Aug 13, 2020 12:51:03 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-F3Lku3SbiBrgBsdf7oB6tM4mQNjUOsvCtZKuMoAoyT0.jar
    Aug 13, 2020 12:51:03 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-AM_QHTvKlxlEcSiH8XET7b3LovM4DqxqR6KJgm-nPUE.jar
    Aug 13, 2020 12:51:03 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-JLFuYO4RtsOPcaigX3ailU1FB99JULU7O4alK9pY0iI.jar
    Aug 13, 2020 12:51:03 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-vHM6D69p9XQX_-EhGW4RGFdF9cDah4uQPZynXBlj2SU.jar
    Aug 13, 2020 12:51:03 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-MMVF2nk_9HvFY1xOKGyOGvkHWug_TGgZAO95CXEOG30.jar
    Aug 13, 2020 12:51:03 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-i50jQjuZmNTOILQY9IOm1qNg3pHsp0CDhZRc5PRojr8.jar
    Aug 13, 2020 12:51:03 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-HTJY5a4MxGaBUQCBh49L-Qcd9NGuFKqoJe4QUxYudSE.jar
    Aug 13, 2020 12:51:03 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-d-e1Mwy8qcZGxzthja9tHxN3vPKvNmQ5G-zMICE1tMM.jar
    Aug 13, 2020 12:51:03 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-3F6j8BT2_RBtWTRV7KKKMtF8kO2f1G_GhjCCUf3wZUU.jar
    Aug 13, 2020 12:51:03 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-qo_OX5kcVRtUa8iiu3rrrZC2gVUUOh93PwTg8MIqBi8.jar
    Aug 13, 2020 12:51:03 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test642613124378879995.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-SFoYHFYksePNp4lEKeAxUYOJYXq_oZZTTFyXj-Ylqig.jar
    Aug 13, 2020 12:51:03 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-4RZLDUyHBw7UwBgEczME2fI7W85zhhfThN8SbemSihA.jar
    Aug 13, 2020 12:51:03 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-l4483x_S_6-ZysP0Q-RpcJ9KVM25Xx_9Z7_I0sZ9bb4.jar
    Aug 13, 2020 12:51:03 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-BhusfQmhpdHB6w0TUfhSgcG945hA0pWCRIv3_g2u6Rw.jar
    Aug 13, 2020 12:51:03 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-9m6jTZcBvjjxNc7ShmdRBWDa6t6GMzN-l-x4_npWq_I.jar
    Aug 13, 2020 12:51:03 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-h-MDjZYqWgaopoqFA9OLPoatTwJiw0bW4oTbkwzCqqg.jar
    Aug 13, 2020 12:51:03 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-4aN0czIRJyvtmWljH8kYGKB-cJEx6kuZ1zHa4c9b-UE.jar
    Aug 13, 2020 12:51:03 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-upRcnP0c8LcYoVHND8-TVjUeXBUOy7xnuBDHUnjer8Y.jar
    Aug 13, 2020 12:51:03 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-Z4L8nz_kVKMBo_JkM48CAf1v7Nezv2tgMDD2FjOzQRY.jar
    Aug 13, 2020 12:51:03 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-Cu2v31jA5KjRJatYWwgNqpeYkywV4MwXywGePfxtjYk.jar
    Aug 13, 2020 12:51:03 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-JKraJ1wTqilufJZ8A6oKQOGEsD2VOJy_dLhDGQRsTJY.jar
    Aug 13, 2020 12:51:03 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-9jp5lerjrJmhQ_RGexR2HBTGJlOVVuHK_yuuisgBJi8.jar
    Aug 13, 2020 12:51:03 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-F3ykaIf57fKSl2S6LtW6NYoovHr02vWCxw6CTwpfKTo.jar
    Aug 13, 2020 12:51:03 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-fDgjLo2yQPdqVP_0XHsRQbm1c-_v3SyPs6yh2NSQMtg.jar
    Aug 13, 2020 12:51:03 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-4tEJ00fYo-hRYVp3iXQXQuwEfRJC5U2K7V5WWm4E9ps.jar
    Aug 13, 2020 12:51:04 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-6yl5ATihxpyQDvc2rfhUTlDRVcSfMfi5tEciQkH3K1Q.jar
    Aug 13, 2020 12:51:04 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-Z671k3iBHsYDaOisgREI23D-ZaD6MHk5JgDX_m_AHZc.jar
    Aug 13, 2020 12:51:04 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-jb3mY7sWGzwv_b4MUSpwP9vyWKxKG99z2grJUcHWyBs.jar
    Aug 13, 2020 12:51:04 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 1 seconds
    Aug 13, 2020 12:51:04 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 13, 2020 12:51:05 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 13, 2020 12:51:05 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 13, 2020 12:51:05 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 13, 2020 12:51:05 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 13, 2020 12:51:05 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93833 bytes, hash 0fa3eadcf893ea4e966bea5dfc0cd16a3ae7806e62948818ceefe970a6f6cb5a> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-D6Pq3PiT6k6Wa-pd_AzRajrngG5ilIgYzu_pcKb2y1o.pb
    Aug 13, 2020 12:51:05 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Aug 13, 2020 12:51:07 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-12_17_51_05-14264271405221323141?project=apache-beam-testing
    Aug 13, 2020 12:51:07 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-12_17_51_05-14264271405221323141
    Aug 13, 2020 12:51:07 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-12_17_51_05-14264271405221323141
    Aug 13, 2020 12:51:07 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-13T00:51:05.592Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 13, 2020 12:51:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-13T00:51:13.566Z: Worker configuration: n1-standard-1 in us-central1-a.
    Aug 13, 2020 12:51:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-13T00:51:14.376Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 13, 2020 12:51:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-13T00:51:14.409Z: Expanding GroupByKey operations into optimizable parts.
    Aug 13, 2020 12:51:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-13T00:51:14.454Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 13, 2020 12:51:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-13T00:51:14.544Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 13, 2020 12:51:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-13T00:51:14.572Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 13, 2020 12:51:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-13T00:51:14.597Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 13, 2020 12:51:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-13T00:51:14.630Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 13, 2020 12:51:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-13T00:51:14.952Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 13, 2020 12:51:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-13T00:51:15.015Z: Starting 5 workers in us-central1-a...
    Aug 13, 2020 12:51:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-13T00:51:28.032Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 13, 2020 12:51:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-13T00:51:41.992Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 13, 2020 12:52:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-13T00:52:00.822Z: Workers have started successfully.
    Aug 13, 2020 12:52:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-13T00:52:00.855Z: Workers have started successfully.
    Aug 13, 2020 12:52:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-13T00:52:36.626Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 13, 2020 12:52:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-13T00:52:36.824Z: Cleaning up.
    Aug 13, 2020 12:52:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-13T00:52:36.912Z: Stopping worker pool...
    Aug 13, 2020 12:53:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-13T00:53:33.975Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 13, 2020 12:53:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-13T00:53:34.014Z: Worker pool stopped.
    Aug 13, 2020 12:53:41 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-12_17_51_05-14264271405221323141 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): e55ac3d8-f20a-4f2d-a9c1-b6a91ae4558e and timestamp: 2020-08-13T00:53:41.815000000Z:
                     Metric:                    Value:
                   read_time                    15.726
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 13, 2020 12:53:42 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 13 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.017 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.031 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':',5,main]) completed. Took 2 mins 52.383 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 59s
106 actionable tasks: 105 executed, 1 from cache

Publishing build scan...
https://gradle.com/s/pwnawirspmazm

Stopped 12 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #863

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/863/display/redirect?page=changes>

Changes:

[kcweaver] [BEAM-10681] Set metrics supported in Spark portable runner.

[tobiasz.kedzierski] [BEAM-10686] Simplify names for GCP variables check in GA

[noreply] Fix format string in PipelineValidator (#12522)

[noreply] Fix some typos (#12539)

[noreply] Linkage Checker 1.5.0 (#12545)

[noreply] [BEAM-10684] Fix jdbc cross-language transform (#12543)


------------------------------------------
[...truncated 293.02 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 12, 2020 6:45:21 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 12, 2020 6:45:21 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 12, 2020 6:45:21 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 12, 2020 6:45:21 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 12, 2020 6:45:21 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 12, 2020 6:45:21 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 12, 2020 6:45:21 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 12, 2020 6:45:22 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 12, 2020 6:45:22 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 12, 2020 6:45:22 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 12, 2020 6:45:22 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 12, 2020 6:45:22 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 12, 2020 6:45:22 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 12, 2020 6:45:22 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 12, 2020 6:45:22 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 12, 2020 6:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 12, 2020 6:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 12, 2020 6:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-CtQQhzvfu96DCZ9r4DUhUkAv3rWO2kH0Nnhk3G0glYQ.jar
    Aug 12, 2020 6:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests-sz8nSKzoyWLCDmDikO-u5PBqRZ-TNjD2yv3YwWqLiAk.jar
    Aug 12, 2020 6:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT-16gYeUyhJPBX1Ne49ZzcYlztzcDDS7uK3koQqfF8ZlU.jar
    Aug 12, 2020 6:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT-nlRBrMfXyHVZQTH5-JBMnRz7WDJRvxcFpsWxoq7q6tk.jar
    Aug 12, 2020 6:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.24.0-SNAPSHOT-J06zhkbM-BDCxxdTsIp3AGkZOtPBVcPOfgiwms06Mn0.jar
    Aug 12, 2020 6:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-tests-2jUHJp5hMOy2qm-IJ5maxpmGEus_OW-YBOL8KLPTJxc.jar
    Aug 12, 2020 6:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests-32psDiAjS8ucPT1KfIQ29cjmwt1hJrpuSrmtOUYWUYo.jar
    Aug 12, 2020 6:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-lWOCej2l5ROuqOXGgfubUxYN0aCtXg_hmEH0EY9mh6s.jar
    Aug 12, 2020 6:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-v1iV1vJyjmzpcFGOQtGDevJWV8lvuV_femTrRqT_V78.jar
    Aug 12, 2020 6:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-QxBABxwSNHM7bF5UtPg0b3ks_YejdxrnWc7kiVbvOoQ.jar
    Aug 12, 2020 6:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests-9mwsfF0KXQocHqLIMvpfMhHEF7phOebbkVqej8KJ4p8.jar
    Aug 12, 2020 6:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-lEUZbkKO3IV_dsk9XY8svQdEUR1xPlOuHnR20kCuCi8.jar
    Aug 12, 2020 6:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-KzuaSpvlcEAWZC_LeZaDTRk0uidTVRliYkajRXWf54c.jar
    Aug 12, 2020 6:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.24.0-SNAPSHOT-W2WKpbWFtE9Q833ilb-BBqUFLgV6BnF_GUGqLXABGb4.jar
    Aug 12, 2020 6:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests-vrmj8qXGgUTnXkjOqobeM9hWfPrMH6eguVfBRYdMWUI.jar
    Aug 12, 2020 6:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests-9A2OrcBrYWPGz97jpnd21lvnablZIvuYqIMHU8NNDYk.jar
    Aug 12, 2020 6:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-MvTICjvgZTwmF14BBG3cy9eEUCKqT5ISE2ev-GdGlZk.jar
    Aug 12, 2020 6:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test1448716550483968389.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-FzWee6ZSBtD2KlMfHAyQoJlesPzLlR1yEB76P-cELKk.jar
    Aug 12, 2020 6:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-CtQQhzvfu96DCZ9r4DUhUkAv3rWO2kH0Nnhk3G0glYQ.jar
    Aug 12, 2020 6:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests-6hiG-cPVttl0g-2RsJNHUqqiHGRmiwz-_qt2XqqGPzk.jar
    Aug 12, 2020 6:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded-uk3F8Y6sOdaI8B-S9vffTj062mFUcwHFwu9sUzhbx48.jar
    Aug 12, 2020 6:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-ISVca0diNBEVtqHZn6OhT7vQWiuczxKcWsGeAvjwjg0.jar
    Aug 12, 2020 6:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT-cj3EmKxYcYIoJXbVE-jf-hDz6a3AfkyvRCLnUWqaet4.jar
    Aug 12, 2020 6:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-L4alcwQiKIsTeYlcoywaCIX7ZarxQGNdvkFvPZyqQBQ.jar
    Aug 12, 2020 6:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.24.0-SNAPSHOT-3-PDiErKbudR2rrZYGx7Qc_sMWti_PYTsOIyPm6h2-Q.jar
    Aug 12, 2020 6:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-6THAzlrygYZeMQsMonqyp9E9HQVkvfROutnKxIBrv8M.jar
    Aug 12, 2020 6:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-bt4-WkfjhZ2RCUji3tAvN8xvnK6RJ8-GOsy0HkimEdo.jar
    Aug 12, 2020 6:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests-7iwgEh8XGoCz1_VOAwrk7SsY08Rinl2j2P4JQNuhPSc.jar
    Aug 12, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.24.0-SNAPSHOT-s7k-ZeVY8K9vxCfuj_-muZJfS1_RpwPUGZmAaDokB_I.jar
    Aug 12, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.24.0-SNAPSHOT-FgzZMYRnPTR3wivAOKE55R7Lf6pbobQPo72-q8_rGIk.jar
    Aug 12, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.24.0-SNAPSHOT-40J_sG-Sf2lqsByESCxG93UTo41r2eO153_30OakTDM.jar
    Aug 12, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 1 seconds
    Aug 12, 2020 6:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 12, 2020 6:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 12, 2020 6:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 12, 2020 6:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 12, 2020 6:45:27 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 12, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93834 bytes, hash f1d5966494d3d54200dd11befa5d8f8937e3938c7ca4bc5c3ad03bd802afbccf> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-8dWWZJTT1UIA3RG--l2PiTfjk4x8pLxcOtA72AKvvM8.pb
    Aug 12, 2020 6:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.24.0-SNAPSHOT
    Aug 12, 2020 6:45:29 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-12_11_45_28-8794601417156717702?project=apache-beam-testing
    Aug 12, 2020 6:45:29 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-12_11_45_28-8794601417156717702
    Aug 12, 2020 6:45:29 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-12_11_45_28-8794601417156717702
    Aug 12, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-12T18:45:28.469Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
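
The warning just above reflects option interplay rather than an error: once autoscaling is disabled, only the fixed worker count matters. A minimal sketch of the corresponding Dataflow pipeline options follows; the class name is invented for the sketch, and the worker counts simply mirror the values in this log.

    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions.AutoscalingAlgorithmType;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class FixedWorkersSketch {
      public static void main(String[] args) {
        DataflowPipelineOptions options =
            PipelineOptionsFactory.fromArgs(args).as(DataflowPipelineOptions.class);
        // With autoscaling switched off, the service uses the fixed numWorkers value
        // and ignores maxNumWorkers, which is what the warning above reports.
        options.setAutoscalingAlgorithm(AutoscalingAlgorithmType.NONE);
        options.setNumWorkers(5);
        // options.setMaxNumWorkers(5);  // only meaningful when autoscaling is enabled
      }
    }
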
    Aug 12, 2020 6:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-12T18:45:37.178Z: Worker configuration: n1-standard-1 in us-central1-a.
    Aug 12, 2020 6:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-12T18:45:37.889Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 12, 2020 6:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-12T18:45:37.930Z: Expanding GroupByKey operations into optimizable parts.
    Aug 12, 2020 6:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-12T18:45:37.956Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 12, 2020 6:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-12T18:45:38.022Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 12, 2020 6:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-12T18:45:38.049Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 12, 2020 6:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-12T18:45:38.084Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 12, 2020 6:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-12T18:45:38.119Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 12, 2020 6:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-12T18:45:38.463Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 12, 2020 6:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-12T18:45:38.543Z: Starting 5 workers in us-central1-a...
    Aug 12, 2020 6:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-12T18:45:49.816Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
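
The 100-descriptor warning above is driven by how user metrics are named, not by how much data flows through them: every unique namespace/name pair becomes its own Stackdriver metric descriptor. A minimal sketch of a user-defined counter in a DoFn follows; the class name and metric name are invented for illustration.

    import org.apache.beam.sdk.metrics.Counter;
    import org.apache.beam.sdk.metrics.Metrics;
    import org.apache.beam.sdk.transforms.DoFn;

    // One namespace/name pair passed to Metrics.counter() corresponds to one
    // Dataflow custom metric, regardless of how many workers increment it.
    public class CountingFn extends DoFn<String, String> {
      private final Counter elements = Metrics.counter(CountingFn.class, "elements_processed");

      @ProcessElement
      public void process(@Element String value, OutputReceiver<String> out) {
        elements.inc();  // all increments roll up under the single descriptor above
        out.output(value);
      }
    }
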
    Aug 12, 2020 6:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-12T18:46:09.117Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 12, 2020 6:46:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-12T18:46:26.956Z: Workers have started successfully.
    Aug 12, 2020 6:46:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-12T18:46:26.994Z: Workers have started successfully.
    Aug 12, 2020 6:47:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-12T18:46:59.519Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 12, 2020 6:47:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-12T18:46:59.764Z: Cleaning up.
    Aug 12, 2020 6:47:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-12T18:46:59.838Z: Stopping worker pool...
    Aug 12, 2020 6:47:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-12T18:47:52.176Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 12, 2020 6:47:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-12T18:47:52.227Z: Worker pool stopped.
    Aug 12, 2020 6:48:00 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-12_11_45_28-8794601417156717702 finished with status DONE.
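
The terminal-state line above is printed after the runner blocks on the submitted job. With the public API, the same wait looks roughly like the sketch below; the class name is invented and no transforms are shown.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.PipelineResult;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class WaitForJobSketch {
      public static void main(String[] args) {
        Pipeline pipeline = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
        // ... transforms would be applied here ...
        PipelineResult result = pipeline.run();
        // Block until the job reaches a terminal state; DONE matches the log line above.
        PipelineResult.State state = result.waitUntilFinish();
        if (state != PipelineResult.State.DONE) {
          throw new RuntimeException("Pipeline finished in state " + state);
        }
      }
    }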

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): ba85a6d6-e68b-404c-a91f-74aec2baac61 and timestamp: 2020-08-12T18:48:00.634000000Z:
                     Metric:                    Value:
                   read_time                    13.964
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 12, 2020 6:48:01 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
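
The warning above means the InfluxDB publisher was not given a measurement/database for this run, so the metrics printed just before it are only logged, not stored. Publishing would require passing those two settings to the test invocation; the property names below are assumptions made for illustration (they do not appear in this log) and the placeholders need real values.

    > ./gradlew :sdks:java:extensions:sql:perf-tests:integrationTest -DinfluxMeasurement=<measurement> -DinfluxDatabase=<database>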

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 7,5,main]) completed. Took 2 mins 47.975 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
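
For a local reproduction from a Beam checkout, the failing task named above can be re-run directly with the suggested flag (the integration test also needs its usual pipeline options, which are not shown in this log):

    > ./gradlew :sdks:java:extensions:sql:perf-tests:integrationTest --stacktrace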

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 43s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/le73unupqgyz2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #862

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/862/display/redirect?page=changes>

Changes:

[Maximilian Michels] Revert "Merge pull request #12408: [BEAM-10602] Display Python streaming

[noreply] [BEAM-7705] Add BigQuery Java samples (#12118)

[Maximilian Michels] [BEAM-10602] Add latency/checkpoint duration as separate panels in ParDo


------------------------------------------
[...truncated 293.57 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 12, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 12, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 12, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 12, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 12, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 12, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 12, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
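
Both failing tests hit the coder problem quoted above, and the message itself names the fix: give the Row PCollection a schema via setRowSchema (or an explicit RowCoder via setCoder). The sketch below shows that fix for a ParDo that emits Row; the schema fields, sample values, and class name are invented for illustration and are not the IT's actual code.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Schema for the two columns this toy example emits.
        final Schema schema =
            Schema.builder().addStringField("type").addInt64Field("score").build();

        PCollection<Row> rows =
            p.apply(Create.of("story,3", "job,5"))
                .apply(
                    "ToRow",
                    ParDo.of(
                        new DoFn<String, Row>() {
                          @ProcessElement
                          public void process(@Element String line, OutputReceiver<Row> out) {
                            String[] parts = line.split(",");
                            out.output(
                                Row.withSchema(schema)
                                    .addValues(parts[0], Long.parseLong(parts[1]))
                                    .build());
                          }
                        }))
                // A ParDo that emits Row has no inferable coder; attach the schema
                // before the pipeline is finalized, otherwise getCoder() fails
                // exactly as in the stack trace above.
                .setRowSchema(schema);
        // Equivalent alternative: .setCoder(org.apache.beam.sdk.coders.RowCoder.of(schema))

        p.run().waitUntilFinish();
      }
    }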

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 12, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 12, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 12, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 12, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 12, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 12, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 12, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 12, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
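
The plan and the filter line above show what the push-down buys: only the used fields are requested and the supported predicate travels to BigQuery instead of being evaluated in the Calc. At the IO level that is roughly equivalent to the DIRECT_READ sketch below; the table reference is a placeholder and this is an illustration, not the code the planner actually generates.

    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class PushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
        p.apply(
            "Read HACKER_NEWS with push-down",
            BigQueryIO.readTableRows()
                .from("<project>:<dataset>.HACKER_NEWS")  // placeholder table reference
                .withMethod(BigQueryIO.TypedRead.Method.DIRECT_READ)
                // Only the columns the query uses (the plan's usedFields)...
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                // ...and the filter the planner marked as supported.
                .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));
        p.run().waitUntilFinish();
      }
    }
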
    Aug 12, 2020 12:45:14 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 12, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 12, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-8RfIlvDqxIA9BZ8OO-_8vyBux1pls-9T6l1rw0WpPVI.jar
    Aug 12, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7514773294204201325.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-c2mn4rbD_57-rVf4ggQZTrUAKHYwiznGAr8b_21Xodg.jar
    Aug 12, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-d2WBLfry97dpvlCWODdfAd5P2efUptSnNvDxpvtLMMU.jar
    Aug 12, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-M30nu7dhunQfA-d1w10I8mU9FJ1OQSlNWj-GMj4ujwU.jar
    Aug 12, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.24.0-SNAPSHOT-03l7FsdWbDsnPDESX5bnkcITpwxaPo-FKXOwWOHAndg.jar
    Aug 12, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests-geYPelPCGlG3g2u-nKfZOfiaOZUNR_Xe4iOHN0_oCro.jar
    Aug 12, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests-7iJyPBOQBe88ztW7SPVcF_d-BQN7y02EbwG5dlxETHI.jar
    Aug 12, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests-6ydrc3PKR2eNIb6PKhUnllXP5dOvgRnVewQgiB7TpRQ.jar
    Aug 12, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests-jh5wOhhVKamv85b2qJSFUg1fp4V8pw7-a9sKmu4LOlQ.jar
    Aug 12, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-8RfIlvDqxIA9BZ8OO-_8vyBux1pls-9T6l1rw0WpPVI.jar
    Aug 12, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests-WJitd484wjPVbnepr0egqweOrnnQqA5Gue1zJyixODo.jar
    Aug 12, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-zbfR4cS7XFEovXmlB-CcAqviOdnMVq7CzmMKUHAHdjI.jar
    Aug 12, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests-ZpMuZDl65tz_pVow7D2zkEhjsZh_pfio04zinyvpCIU.jar
    Aug 12, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-66Ttks-oUpZwn8Azzn2l8ajciGi96HrotLllIYdlmjE.jar
    Aug 12, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.24.0-SNAPSHOT-j5AcK-p8RIaPdNkNdioIOkIShiEM_ql-JorJ7IQFSLs.jar
    Aug 12, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-gh-GUIjkfjqJa8N0GmO5JaRRzHk6PrphaxrIm7oPvuM.jar
    Aug 12, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-xvjUieDUetMzEfSfar8c_khF2mAZACmnj5fczkUogGE.jar
    Aug 12, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-uwbazzEJc4WgGpKFnB6N9PHsfL88KitHZP-VV20CJww.jar
    Aug 12, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-tests-QFcjkB3q2T1VhNG1Xd7cZL0_qFT2182gRs9KU4sdaM0.jar
    Aug 12, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-DywtCQY-QuhUqVyWYilbHj9wU4snX5u6jGcP3h2JXng.jar
    Aug 12, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests-W0WxWyQyUtJ3Vyt0l5hMD_Rtq8L3aDBrHL_4Q92ZSKE.jar
    Aug 12, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT-ehyHUnBl77G2mxpZ9WuQ2SrEJP3IiAzsiYQXWECM948.jar
    Aug 12, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.24.0-SNAPSHOT-He09908AtiQHGlW1ix4nY4n0cPEDORkNF6conS6VjJc.jar
    Aug 12, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT-2YEukI1LQGU32eoqXjzX-EZte-f8rrHEfmHeF1Fc2Ko.jar
    Aug 12, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-D5u6jKk8SaIEO79zg_bXdVnyhCDufSgajokUk3-jPCM.jar
    Aug 12, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT-It3Bh180nAPkb1Dd3v0U-92OnQd84LrtL5OhYP98sls.jar
    Aug 12, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded-EJH39YEUyVM8fMU34MO8T25SmT8CW3pf87K2ljUFLBI.jar
    Aug 12, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-wLRf7zDsIpRBhp8AFhMrXhj3oWhsaDQQ1iuN2F0dOeA.jar
    Aug 12, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.24.0-SNAPSHOT-tK7CndiKmOjqvADgn2T3KcqDnKqTBa2HYd9TAEevOPc.jar
    Aug 12, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.24.0-SNAPSHOT-ocwRWglMrPwpYyPH9cxsC8H0pRgHtJxYGoAfSIEaqXY.jar
    Aug 12, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.24.0-SNAPSHOT-GG86s-iwOBvaQZRdHiZpPldwjQ-_PDx_MYFVBFSI2Gc.jar
    Aug 12, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 1 seconds
    Aug 12, 2020 12:45:17 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 12, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 12, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 12, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 12, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 12, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93834 bytes, hash 1078e32cec83bf100c0a79b1aa5dbe0f6910fedd75b4baa24ec7c31555133863> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-EHjjLOyDvxAMCnmxql2-D2kQ_t11tLqiTsfDFVUTOGM.pb
    Aug 12, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.24.0-SNAPSHOT
    Aug 12, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-12_05_45_18-18154265945079638000?project=apache-beam-testing
    Aug 12, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-12_05_45_18-18154265945079638000
    Aug 12, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-12_05_45_18-18154265945079638000
    Aug 12, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-12T12:45:18.477Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 12, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-12T12:45:30.727Z: Worker configuration: n1-standard-1 in us-central1-a.
    Aug 12, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-12T12:45:31.653Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 12, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-12T12:45:31.695Z: Expanding GroupByKey operations into optimizable parts.
    Aug 12, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-12T12:45:31.729Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 12, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-12T12:45:31.814Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 12, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-12T12:45:31.837Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 12, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-12T12:45:31.870Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 12, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-12T12:45:31.901Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 12, 2020 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-12T12:45:32.597Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 12, 2020 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-12T12:45:32.806Z: Starting 5 workers in us-central1-a...
    Aug 12, 2020 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-12T12:45:37.651Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 12, 2020 12:46:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-12T12:46:00.842Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 12, 2020 12:46:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-12T12:46:20.078Z: Workers have started successfully.
    Aug 12, 2020 12:46:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-12T12:46:20.129Z: Workers have started successfully.
    Aug 12, 2020 12:46:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-12T12:46:55.017Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 12, 2020 12:46:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-12T12:46:55.478Z: Cleaning up.
    Aug 12, 2020 12:46:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-12T12:46:55.623Z: Stopping worker pool...
    Aug 12, 2020 12:47:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-12T12:47:47.272Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 12, 2020 12:47:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-12T12:47:47.316Z: Worker pool stopped.
    Aug 12, 2020 12:47:54 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-12_05_45_18-18154265945079638000 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 2f169d1c-f824-4582-bedf-479884f374bb and timestamp: 2020-08-12T12:47:54.895000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    16.582

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 12, 2020 12:47:55 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.019 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.031 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':',5,main]) completed. Took 2 mins 50.299 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 39s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/hpsv3pj3eg5ga

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #861

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/861/display/redirect>

Changes:


------------------------------------------
[...truncated 294.54 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 12, 2020 6:45:28 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 12, 2020 6:45:28 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 12, 2020 6:45:28 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 12, 2020 6:45:28 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 12, 2020 6:45:28 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 12, 2020 6:45:28 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 12, 2020 6:45:28 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 12, 2020 6:45:28 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 12, 2020 6:45:28 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 12, 2020 6:45:28 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 12, 2020 6:45:28 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 12, 2020 6:45:28 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 12, 2020 6:45:28 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 12, 2020 6:45:29 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 12, 2020 6:45:29 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 12, 2020 6:45:30 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 12, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 12, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-MB5qp2pcLRW3j5CrUlbm7f4zoo1m7RPFGvI_e405GNw.jar
    Aug 12, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-UUcd9qHMpZQGQkoUDkoHEbTEXJKRNvLDC18OglwfTfQ.jar
    Aug 12, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-45dIA3R8Qo_FCyHF3y4mwvPkZQhXT6e7wycEvHhLd1o.jar
    Aug 12, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded-rkHzyrxcqn2gIuJa1W-pshQf_ssmVGRKQZItIuEv8Fw.jar
    Aug 12, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-hytH8BdjAF9UcBpkcBoCSU_FW53GbPLxOrA-Va-burg.jar
    Aug 12, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3242361525655273189.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-vzpTRjZ49fWX0-pAC_ELby6_YhWDfi-wHFz2YnG3UtE.jar
    Aug 12, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests-6pX1mR3Licc4OlcRNsoO15hr16c3owhTxYJ80gdxVJ4.jar
    Aug 12, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-MB5qp2pcLRW3j5CrUlbm7f4zoo1m7RPFGvI_e405GNw.jar
    Aug 12, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-qhwVf9srHj4-gbBqIP6S6FgAyn7Y_8AgTMFdRbMUbjA.jar
    Aug 12, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT-QqIQhAXZzXXkRiExnmk42VIZ7i3LyZ2hPdK9sdUJAkM.jar
    Aug 12, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.24.0-SNAPSHOT-p6ghfMXuHO-kq7gMt60VwyX2kU27G6ioNqUaCgzzm6U.jar
    Aug 12, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-4QmXs-fo2fWe9TmNCRqZTo8Zclgs9GLJRQKPM0ffxq8.jar
    Aug 12, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT-oRxvI7dMZMCbOLQWDQKr8GdE9gTreacZV9q46Wy69x8.jar
    Aug 12, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-qmL6HBqEQws9SCM8FxzFkZFup6RvrLj5tRz3etBH3bo.jar
    Aug 12, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.24.0-SNAPSHOT-3xsHAvDmjZdemYXZytXv-yvEWmMV6WAMlU30TPveZvg.jar
    Aug 12, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests-QKe82qe5etI4bkfMhPfFQkv0JKhd1k8Vlrd-h6dnWZo.jar
    Aug 12, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests-gTnEXJmebaK4MY-Ir7VGtttP7m2rkVwu3-qQVaq2lnw.jar
    Aug 12, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-iYAyKulRlbMwqg2rqchwTp4gfSe4woHha-1OkAZVcb0.jar
    Aug 12, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-b9ek8cBilSq7PTSdo-vCnlsdzgJkKXm9w5BEvQ8nLuU.jar
    Aug 12, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests-o0_gcZMIodHIFaB1Q288JDWfdPpeJ0XeeGl500JJkqM.jar
    Aug 12, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests-nFZtMgUkzosfmvQnKq1vuItfrJSsLJ_LWg97oTl523E.jar
    Aug 12, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests-N6zQ7y88EnYpfD2MutXyW6ld5f_oOBXANYuzxJRzgGc.jar
    Aug 12, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests-QlcJ9PnY7mKjVvstdxcD92sE3eBsXwF0esaL2OUBNzQ.jar
    Aug 12, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-MEFtTavd3WgT2QvwRe80iqJHPpmyLVFByAptyMNSJv8.jar
    Aug 12, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.24.0-SNAPSHOT-RlqTkhvCPMnOyytI50B29GrxUA3QfEy457C60fQcThU.jar
    Aug 12, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-tests-peM4Ad2K0GhQ2PyAw9qG7_lGoZbyp20lkP939ALLgtI.jar
    Aug 12, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-8DzQKiz7t06CHk0qWIuG1Uu4LsW_wcjTHZBhy4txiXw.jar
    Aug 12, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT-wpu9XKHMMXN_wCpN6xhtL0rpJDs5CZdzV9yHfcgs7Fg.jar
    Aug 12, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.24.0-SNAPSHOT-AM-8y9ZcXrfdJY6n3DiSG5gjanPVZCE9Enls_7tjf4s.jar
    Aug 12, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.24.0-SNAPSHOT-F6C8MKRfpl-pnDcKR8hAL9LSkOH2k_bHZlyhPiZNnyY.jar
    Aug 12, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.24.0-SNAPSHOT-SzNFaOvPVfJh8yeibjSAv4ZodfHTbhql9TZoDKC35tg.jar
    Aug 12, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 1 seconds
    Aug 12, 2020 6:45:33 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 12, 2020 6:45:33 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 12, 2020 6:45:33 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 12, 2020 6:45:33 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 12, 2020 6:45:33 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 12, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93833 bytes, hash aea21d719aca90ce25cb8b37a730478f9d7fed2e6bf2d42d60fb7e39078212a7> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-rqIdcZrKkM4ly4s3pzBHj51_7S5r8tQtYPt-OQeCEqc.pb
    Aug 12, 2020 6:45:34 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.24.0-SNAPSHOT
    Aug 12, 2020 6:45:35 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-11_23_45_34-15360951116865820857?project=apache-beam-testing
    Aug 12, 2020 6:45:35 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-11_23_45_34-15360951116865820857
    Aug 12, 2020 6:45:35 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-11_23_45_34-15360951116865820857
    Aug 12, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-12T06:45:34.358Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 12, 2020 6:45:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-12T06:45:42.199Z: Worker configuration: n1-standard-1 in us-central1-a.
    Aug 12, 2020 6:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-12T06:45:42.935Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 12, 2020 6:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-12T06:45:42.975Z: Expanding GroupByKey operations into optimizable parts.
    Aug 12, 2020 6:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-12T06:45:43.012Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 12, 2020 6:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-12T06:45:43.074Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 12, 2020 6:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-12T06:45:43.102Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 12, 2020 6:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-12T06:45:43.142Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 12, 2020 6:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-12T06:45:43.171Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 12, 2020 6:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-12T06:45:43.637Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 12, 2020 6:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-12T06:45:43.714Z: Starting 5 workers in us-central1-a...
    Aug 12, 2020 6:46:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-12T06:46:08.152Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
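
The warning above points at the Cloud Monitoring v3 metricDescriptors API for cleaning up stale custom metrics. Purely as an illustrative sketch (not part of this build, and the descriptor id below is a hypothetical placeholder), one such descriptor could be deleted with the google-cloud-monitoring Java client:

    import com.google.cloud.monitoring.v3.MetricServiceClient;
    import com.google.monitoring.v3.MetricDescriptorName;

    // Rough sketch: delete one custom metric descriptor via the Cloud Monitoring v3 API.
    // Both the project and the descriptor id are illustrative placeholders.
    public class DeleteStaleMetricDescriptor {
      public static void main(String[] args) throws Exception {
        try (MetricServiceClient client = MetricServiceClient.create()) {
          MetricDescriptorName name =
              MetricDescriptorName.of(
                  "apache-beam-testing",                                  // project (placeholder)
                  "custom.googleapis.com/dataflow/some_old_user_metric"); // descriptor (placeholder)
          client.deleteMetricDescriptor(name);
        }
      }
    }
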
    Aug 12, 2020 6:46:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-12T06:46:11.797Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 12, 2020 6:46:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-12T06:46:35.060Z: Workers have started successfully.
    Aug 12, 2020 6:46:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-12T06:46:35.083Z: Workers have started successfully.
    Aug 12, 2020 6:47:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-12T06:47:08.148Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 12, 2020 6:47:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-12T06:47:08.334Z: Cleaning up.
    Aug 12, 2020 6:47:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-12T06:47:08.431Z: Stopping worker pool...
    Aug 12, 2020 6:48:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-12T06:48:07.310Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 12, 2020 6:48:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-12T06:48:07.346Z: Worker pool stopped.
    Aug 12, 2020 6:48:15 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-11_23_45_34-15360951116865820857 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 8e4cab62-f52d-464e-ac93-d1822839f10f and timestamp: 2020-08-12T06:48:15.516000000Z:
                     Metric:                    Value:
                   read_time                    16.316
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 12, 2020 6:48:16 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.032 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 5,5,main]) completed. Took 2 mins 55.705 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 58s
106 actionable tasks: 73 executed, 33 from cache

Publishing build scan...
https://gradle.com/s/gsqew7akjkqpa

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #860

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/860/display/redirect?page=changes>

Changes:

[michal.walenia] [BEAM-10492] Add missing sideinput handling to DLP transforms

[Robin Qiu] Upgrade to ZetaSQL 2020.08.1

[Damian Gadomski] Moving /tmp directory cleanup of CI workers to Inventory Jenkins job

[Boyuan Zhang] Workaround of AutoValueSchema doesn't work with SchemaFieldName

[noreply] [BEAM-7996] Add Python SqlTransform test that includes a MAP input and

[noreply] [BEAM-10663] Disable python kafka integration tests (#12526)

[Robin Qiu] Change SqlAnalyzer code to use the updated ZetaSQL API

[noreply] Merge pull request #12485 from [BEAM-6064] Improvements to BQ streaming

[noreply] [BEAM-8125] Add verifyDeterministic test to SchemaCoderTest (#12521)

[noreply] [BEAM-10571] Use schemas in ExternalConfigurationPayload (#12481)

[noreply] [BEAM-10644] Mark Beam 2.24.0 as the last release with Py2 and Py35

[noreply] Use new ZetaSQL value create API (#12536)


------------------------------------------
[...truncated 299.25 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 12, 2020 12:45:46 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 12, 2020 12:45:46 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 12, 2020 12:45:46 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 12, 2020 12:45:46 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 12, 2020 12:45:46 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 12, 2020 12:45:46 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 12, 2020 12:45:46 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
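
The exception above names the two possible fixes: set a coder explicitly with setCoder(), or attach a schema with PCollection.setRowSchema() so that a RowCoder can be inferred. A minimal sketch of the latter, with an assumed input PCollection, a stand-in DoFn name, and assumed field types (none of these are taken from BigQueryIOPushDownIT):

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    // Schema mirroring the projected columns of the query above (field types assumed).
    Schema schema =
        Schema.builder()
            .addStringField("author")
            .addStringField("type")
            .addStringField("title")
            .addInt32Field("score")
            .build();

    // 'MonitorFn' stands in for any DoFn that emits Row values; 'input' is an assumed PCollection.
    PCollection<Row> monitored =
        input
            .apply("RowMonitor", ParDo.of(new MonitorFn()))
            .setRowSchema(schema);  // makes a RowCoder inferable, avoiding the IllegalStateException
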

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 12, 2020 12:45:47 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 12, 2020 12:45:47 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 12, 2020 12:45:47 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 12, 2020 12:45:47 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 12, 2020 12:45:47 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 12, 2020 12:45:47 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 12, 2020 12:45:47 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 12, 2020 12:45:47 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
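
For reference, the same SQL can also be applied in Java through SqlTransform against a schema-aware PCollection, where the single input is addressed as PCOLLECTION. This is only a sketch of the SQL path with an assumed input collection named 'hackerNews' (fields by/type/title/score); it does not by itself exercise the BigQuery filter push-down shown in the log:

    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    // 'hackerNews' is an assumed schema-aware PCollection<Row>.
    PCollection<Row> filtered =
        hackerNews.apply(
            SqlTransform.query(
                "SELECT `by` AS author, type, title, score "
                    + "FROM PCOLLECTION "
                    + "WHERE (type = 'story' OR type = 'job') AND score > 2"));
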
    Aug 12, 2020 12:45:48 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 12, 2020 12:45:50 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 12, 2020 12:45:50 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-iIAr-cAxgdedPokhajv9wCckVVjvY9QRLW-XreuxELs.jar
    Aug 12, 2020 12:45:50 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-vf1g10iX4A8f_x4LwQR1lrKpiZyjSJ3YtogrUHG0Wzc.jar
    Aug 12, 2020 12:45:50 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-iIAr-cAxgdedPokhajv9wCckVVjvY9QRLW-XreuxELs.jar
    Aug 12, 2020 12:45:50 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-YusjYIqn5xavl779eGpmXXmuBcaSq2xqeGAr9m9DZTU.jar
    Aug 12, 2020 12:45:50 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests-dWe0l6b5HfDcW8h-MXcjVnKAXVMBqqjpkxITl0NAWH8.jar
    Aug 12, 2020 12:45:50 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-n7Tlzx4r-gjehr8FQ2FaS7XgCb-XZHuM2uI74XNA-HA.jar
    Aug 12, 2020 12:45:50 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.24.0-SNAPSHOT-jx0d21JaPRsAntgwlCR9G48nAHaoJoXuPVvcbPzwHho.jar
    Aug 12, 2020 12:45:50 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests-RcCD9bHTJht-SFeRX6y58ppCbkeATa3Z9XvGDlN-FaQ.jar
    Aug 12, 2020 12:45:50 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-w5y5RMXStfpAsEfAr4Gm3hProR5bDuOixMDfU4DEMXE.jar
    Aug 12, 2020 12:45:50 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-vHR7nSl003JU8a_E0YiSl0lF7k38WQFBAe5ViqskRWg.jar
    Aug 12, 2020 12:45:50 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT-KErFaKXquiQfu0SZg1az3TArtEKVm-6YwaLQEzdi3IY.jar
    Aug 12, 2020 12:45:50 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests-QhLtYHolDCmlpqlI61Q73teWv0wSU7WPjslbLjweO6k.jar
    Aug 12, 2020 12:45:50 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-H6k1Bdwjy5n7j91g-N6St5PXm_YHhuTNk49PL3MXVO4.jar
    Aug 12, 2020 12:45:50 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.24.0-SNAPSHOT-5RDnjAsnUvI1OdRcVBGH4yRurhWNy1uxhZ0s-8iyDgY.jar
    Aug 12, 2020 12:45:50 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-DMWA59yPT7RtWBcPG2HRuWcCmb29Na543YTilkRHN18.jar
    Aug 12, 2020 12:45:50 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests-EJRGNmV8_5fACJVe_VqS8iguU2QRWDdg7TLME4g9VqM.jar
    Aug 12, 2020 12:45:50 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT-QTTmSMbo-t_dWTuP-X1u7ixWv9hVdp6z8vZHMH9idc4.jar
    Aug 12, 2020 12:45:50 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.24.0-SNAPSHOT-9qZi9sC10pRUnK8Sfa_uxcCg15JxuEmPsQihCGs_fpo.jar
    Aug 12, 2020 12:45:50 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-hCLsRKGn1lNAo1yNCo1koDYMExrJDM_IVCx4lGBJXvM.jar
    Aug 12, 2020 12:45:50 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-tests-X7eNNzUPYk6e7lEd1BOVQSZ5FQXvuQY_cMsQGeGdh9M.jar
    Aug 12, 2020 12:45:50 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded-_ACmX0rgWyYwLM9rm7Jx7rMCaD8xFsNVYDsJc9Zf1jE.jar
    Aug 12, 2020 12:45:50 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2156955275015175678.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-aJw8VPQpwhc5rbKIR8b9rAUPDe2C27iTRjTNwRDr1CU.jar
    Aug 12, 2020 12:45:50 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-wcaC7IsH-ihRhPTbP2h8L85m2wg1DYFQ1k1V-nqZ-qE.jar
    Aug 12, 2020 12:45:50 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests-xqbPhFDRINkil2KIcTRIOEsiHST1R7gvEQxOUsZtBvk.jar
    Aug 12, 2020 12:45:50 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests-dSmyUMEpPAKvlZ25VoK0QKoBOstHx2gPGRHBJ5DIxaE.jar
    Aug 12, 2020 12:45:50 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-DQsrqNoh93u1WbiJJw3dQCfYy8dMXGayBd7H8_wYiSA.jar
    Aug 12, 2020 12:45:50 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests-JFjcbaH-eMUrxRL2RIS3EXuwNVTFy_i1wyoi7XedJlI.jar
    Aug 12, 2020 12:45:50 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT-M6ADEBNeLDdUhviCLHWWtE4Y-YRH0u3f7PfdOlbALSk.jar
    Aug 12, 2020 12:45:50 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.24.0-SNAPSHOT-aDwKrGdfKrkvHG5EBeZBPrfkOIGCFA5NOboXxF3x0ww.jar
    Aug 12, 2020 12:45:50 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.24.0-SNAPSHOT-rLxsrOX8NYPHustBefmsfmt4uQJmapqmMUL5FX7H0x8.jar
    Aug 12, 2020 12:45:50 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.24.0-SNAPSHOT-uK91V9Ao9WJrbdc_Mob-Mz3gXx11it2AhRUtH6ydBJ4.jar
    Aug 12, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 1 seconds
    Aug 12, 2020 12:45:51 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 12, 2020 12:45:51 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 12, 2020 12:45:51 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 12, 2020 12:45:51 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 12, 2020 12:45:51 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 12, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93833 bytes, hash e2128553d14a90a7be4f89b9f2dc86669d225675f02943087ff1195ddf89d95b> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-4hKFU9FKkKe-T4m58tyGZp0iVnXwKUMIf_EZXd-J2Vs.pb
    Aug 12, 2020 12:45:51 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.24.0-SNAPSHOT
    Aug 12, 2020 12:45:53 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-11_17_45_51-7162512413859620079?project=apache-beam-testing
    Aug 12, 2020 12:45:53 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-11_17_45_51-7162512413859620079
    Aug 12, 2020 12:45:53 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-11_17_45_51-7162512413859620079
    Aug 12, 2020 12:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-12T00:45:51.977Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 12, 2020 12:46:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-12T00:46:00.046Z: Worker configuration: n1-standard-1 in us-central1-a.
    Aug 12, 2020 12:46:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-12T00:46:00.680Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 12, 2020 12:46:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-12T00:46:00.718Z: Expanding GroupByKey operations into optimizable parts.
    Aug 12, 2020 12:46:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-12T00:46:00.751Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 12, 2020 12:46:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-12T00:46:00.812Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 12, 2020 12:46:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-12T00:46:00.840Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 12, 2020 12:46:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-12T00:46:00.885Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 12, 2020 12:46:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-12T00:46:00.921Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 12, 2020 12:46:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-12T00:46:01.298Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 12, 2020 12:46:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-12T00:46:01.382Z: Starting 5 workers in us-central1-a...
    Aug 12, 2020 12:46:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-12T00:46:24.778Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 12, 2020 12:46:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-12T00:46:26.595Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 12, 2020 12:46:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-12T00:46:56.374Z: Workers have started successfully.
    Aug 12, 2020 12:46:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-12T00:46:56.404Z: Workers have started successfully.
    Aug 12, 2020 12:47:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-12T00:47:28.119Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 12, 2020 12:47:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-12T00:47:28.314Z: Cleaning up.
    Aug 12, 2020 12:47:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-12T00:47:28.402Z: Stopping worker pool...
    Aug 12, 2020 12:48:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-12T00:48:21.346Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 12, 2020 12:48:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-12T00:48:21.393Z: Worker pool stopped.
    Aug 12, 2020 12:48:28 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-11_17_45_51-7162512413859620079 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 51eb0e0f-cd5e-4f57-a83f-e51e38fb2434 and timestamp: 2020-08-12T00:48:28.648000000Z:
                     Metric:                    Value:
                   read_time                    10.578
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 12, 2020 12:48:29 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 4 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 2 mins 51.252 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 13s
106 actionable tasks: 80 executed, 26 from cache

Publishing build scan...
https://gradle.com/s/2jbfa3kwgf2sy

Stopped 3 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #859

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/859/display/redirect?page=changes>

Changes:

[Damian Gadomski] Extending archiveJunit post-commit task with stability history

[noreply] [BEAM-10572] Eliminate nullability errors from


------------------------------------------
[...truncated 293.40 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 11, 2020 6:45:22 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 11, 2020 6:45:22 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 11, 2020 6:45:22 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 11, 2020 6:45:22 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 11, 2020 6:45:22 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 11, 2020 6:45:22 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 11, 2020 6:45:22 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 11, 2020 6:45:22 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 11, 2020 6:45:22 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 11, 2020 6:45:22 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 11, 2020 6:45:22 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 11, 2020 6:45:22 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 11, 2020 6:45:22 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 11, 2020 6:45:22 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 11, 2020 6:45:22 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 11, 2020 6:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 11, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 11, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-k_QU3yZqKr3w131-7X2RTXGAlb6mJ7LbYV1tJCiQ7xs.jar
    Aug 11, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.24.0-SNAPSHOT-opJVNvAPV67y4RLd7oYPVozZbVF5pQK09I22iDwJiWI.jar
    Aug 11, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-MgbFjrypAwgZ3sYULmumdL2xos_JzK2rY7ifb7vs1X4.jar
    Aug 11, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3338760398444166682.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-ZsDxHd8YcNuCb2-y3Ino0qccU8xbkMTgL1oMo03TqY8.jar
    Aug 11, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-ab_4vltCgZDRuO_EZH7WlOLf2s1bEHvRTp2qyN_0tm4.jar
    Aug 11, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-r6FL7f67RDOLDKQFCQPeoIscd_80HwGuzZhSsqRK5WU.jar
    Aug 11, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests-7GoktAzt1BlPYpxI1CwPlgKNdVSaNgrPqHCFHVgquCo.jar
    Aug 11, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-z6rXl5gu9yrQU_RaJK3cjXWyWzU8Afx_QUEPwj8TbhI.jar
    Aug 11, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-zveEXk9950PyAQjZLdIkb63Aq_lVk1tLxnu6y3q1Eec.jar
    Aug 11, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests-MRgSzAi696cBALp00citZbVmYsH3Qis5xTAHkuDbUpk.jar
    Aug 11, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests-fIHRT7YtY23bMEYKVsJYJszzBut2_ExRm-WQvNTvulM.jar
    Aug 11, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-DhCOm81OAtlnu9ySpTerYoCvpvIBQuXvC9yshbonKJE.jar
    Aug 11, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-Dp6YMvbJrEvTytVPamusG-wdXVFw19urir2HsVF0tWc.jar
    Aug 11, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests-uoLtZgSWvrXBg2x4o-6K4PoH53yfLHpoPJYRQKSxVlM.jar
    Aug 11, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests-2WA0Dh07146l8vnU5nKT2VmjPwqEFyjWZgZz8Hzisao.jar
    Aug 11, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-k_QU3yZqKr3w131-7X2RTXGAlb6mJ7LbYV1tJCiQ7xs.jar
    Aug 11, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-8Q_w9oLachX-hcYhN0148BImy0URilbw9SSD2_wsePs.jar
    Aug 11, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.24.0-SNAPSHOT-rHNc4T3dAnLTqbHzw49SFQ87Y4NCGAZ-ylSMfJ08D2U.jar
    Aug 11, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.24.0-SNAPSHOT-smzXkffEpNCmr_3tvu0s8DWThBQ6mZQlVavaNuwGf78.jar
    Aug 11, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests-mkV4nX1xLcPbdmXlHQdFsN3Y_9ztpio5C6Qdm1hkTWw.jar
    Aug 11, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-WU004iSb0u_FYOCI7p40N_OqdqB-ZT2NO7XPtyxMsJ0.jar
    Aug 11, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT-6gp7W2nldf5oc3-IKNQlQPg2sRnXmvKd6HJF9hNXzz8.jar
    Aug 11, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT-ZY2NJS3rFI399ziiQC3L-nm76yeOFf_rwXF7Drt1__s.jar
    Aug 11, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-KKTjkVVb5Ur-kI9xV9uRWG-dHLbFYMmV8cGYUI-B1iM.jar
    Aug 11, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests-ACH0tJgtnHyXPWtKBFzALRnzYjvhRQ_jic_w66OgnR4.jar
    Aug 11, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-tests-2NMs6QMaIaMwnYiEJFVSbcS2Wr2eHPk4M-Vw_nB6DC8.jar
    Aug 11, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded-TEo7PsGxevTaEEXa4D0VFQfESphvQ6aQJ6HN4kpI6SM.jar
    Aug 11, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT-UBVngXA5HeN3orMR4KcF9aI3reQoBX90BcYm6zxDKEQ.jar
    Aug 11, 2020 6:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.24.0-SNAPSHOT-3kP1xhhFesszlFVk3fdD_T7p3NQTyNxTYvinr2AHIq0.jar
    Aug 11, 2020 6:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.24.0-SNAPSHOT-hfMxfF-mOh9tkhgYxkyJC327e9tAerFw6dpwcfVe-XM.jar
    Aug 11, 2020 6:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.24.0-SNAPSHOT-D2ox-SW1DQuT6kAKgZ0kPpBcASxV7M1Uf186yDGVGp4.jar
    Aug 11, 2020 6:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 1 seconds
    Aug 11, 2020 6:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 11, 2020 6:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 11, 2020 6:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 11, 2020 6:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 11, 2020 6:45:27 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 11, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93834 bytes, hash 211f66b4a203e7b7a00841b058915f6f8117f5f093ab171761adc84b7f461125> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-IR9mtKID57egCEGwWJFfb4EX9fCTqxcXYa3IS39GESU.pb
    Aug 11, 2020 6:45:27 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.24.0-SNAPSHOT
    Aug 11, 2020 6:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-11_11_45_27-5819371263453255362?project=apache-beam-testing
    Aug 11, 2020 6:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-11_11_45_27-5819371263453255362
    Aug 11, 2020 6:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-11_11_45_27-5819371263453255362
    Aug 11, 2020 6:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-11T18:45:27.407Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 11, 2020 6:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-11T18:45:35.702Z: Worker configuration: n1-standard-1 in us-central1-b.
    Aug 11, 2020 6:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-11T18:45:36.355Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 11, 2020 6:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-11T18:45:36.430Z: Expanding GroupByKey operations into optimizable parts.
    Aug 11, 2020 6:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-11T18:45:36.541Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 11, 2020 6:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-11T18:45:36.618Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 11, 2020 6:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-11T18:45:36.649Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 11, 2020 6:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-11T18:45:36.682Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 11, 2020 6:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-11T18:45:36.714Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 11, 2020 6:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-11T18:45:37.305Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 11, 2020 6:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-11T18:45:37.386Z: Starting 5 workers in us-central1-b...
    Aug 11, 2020 6:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-11T18:45:51.702Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 11, 2020 6:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-11T18:46:02.532Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 11, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-11T18:46:22.550Z: Workers have started successfully.
    Aug 11, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-11T18:46:22.574Z: Workers have started successfully.
    Aug 11, 2020 6:47:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-11T18:46:59.776Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 11, 2020 6:47:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-11T18:46:59.982Z: Cleaning up.
    Aug 11, 2020 6:47:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-11T18:47:00.093Z: Stopping worker pool...
    Aug 11, 2020 6:47:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-11T18:47:55.507Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 11, 2020 6:47:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-11T18:47:55.545Z: Worker pool stopped.
    Aug 11, 2020 6:48:02 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-11_11_45_27-5819371263453255362 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 9a1dfd5a-8222-4f5b-9dbb-e280deeae860 and timestamp: 2020-08-11T18:48:02.221000000Z:
                     Metric:                    Value:
                   read_time                     16.84
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 11, 2020 6:48:02 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.018 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':',5,main]) completed. Took 2 mins 48.773 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 47s
106 actionable tasks: 71 executed, 35 from cache

Publishing build scan...
https://gradle.com/s/iy6lqwnwqf4as

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #858

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/858/display/redirect?page=changes>

Changes:

[noreply] [BEAM-601] Run KinesisIOIT with localstack (#12422)


------------------------------------------
[...truncated 300.64 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 11, 2020 12:45:31 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 11, 2020 12:45:31 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 11, 2020 12:45:31 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 11, 2020 12:45:31 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 11, 2020 12:45:31 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 11, 2020 12:45:31 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 11, 2020 12:45:31 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
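
The failure above is the schema/coder problem the exception spells out: the Row-typed output of ParDo(RowMonitor) has no schema attached, so no default coder can be inferred. A minimal sketch of the remedy the message points at, PCollection.setRowSchema, follows; the schema fields, class name, and pipeline wiring are illustrative assumptions, not the actual test code.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Illustrative schema matching the columns projected by the query in this log.
        final Schema schema =
            Schema.builder()
                .addNullableField("author", Schema.FieldType.STRING)
                .addNullableField("type", Schema.FieldType.STRING)
                .addNullableField("title", Schema.FieldType.STRING)
                .addNullableField("score", Schema.FieldType.INT32)
                .build();

        // A ParDo that emits Rows, analogous to ParDo(RowMonitor) in the failing test.
        PCollection<Row> rows =
            p.apply(Create.of("story", "job"))
                .apply(
                    ParDo.of(
                        new DoFn<String, Row>() {
                          @ProcessElement
                          public void processElement(ProcessContext c) {
                            c.output(
                                Row.withSchema(schema)
                                    .addValues("someone", c.element(), "a title", 3)
                                    .build());
                          }
                        }))
                // Without this call, coder inference for the Row output fails with the
                // IllegalStateException shown above.
                .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }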

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 11, 2020 12:45:31 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 11, 2020 12:45:31 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 11, 2020 12:45:32 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 11, 2020 12:45:32 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 11, 2020 12:45:32 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 11, 2020 12:45:32 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 11, 2020 12:45:32 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 11, 2020 12:45:32 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
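
The plan above narrows the read to the four used fields and hands the WHERE clause to BigQuery, so projection and filtering happen server side through the Storage Read API rather than inside the Beam pipeline. Outside Beam SQL, roughly the same shape can be expressed against BigQueryIO directly; the sketch below is an illustration under assumptions (the table reference is a placeholder, not the table this test reads).

    import java.util.Arrays;
    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.values.PCollection;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        PCollection<TableRow> rows =
            p.apply(
                BigQueryIO.readTableRows()
                    // Placeholder table reference; the IT reads a HACKER_NEWS table
                    // whose exact path is not shown in this log.
                    .from("some-project:some_dataset.hacker_news")
                    .withMethod(Method.DIRECT_READ)
                    // Column projection: only the fields the query touches.
                    .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                    // Row restriction: the predicate the planner pushed down.
                    .withRowRestriction(
                        "(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }
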
    Aug 11, 2020 12:45:33 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 11, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 11, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-rid7PWUjtwguHVnbiQbb8fljwButRY_eqBQwhR5FlkM.jar
    Aug 11, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-gtxJ-4p4haZMYfoT_SLTN92gQQYKBXEw0ZOZVFVNbkM.jar
    Aug 11, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-qEB-wFn1oFHlTquqDJe5WTlzbMr0e7fgVH3SlL12f9c.jar
    Aug 11, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.24.0-SNAPSHOT-oPAW9Z_b5LOgGI-AMzGpMqhGG2bK7ItjdBS473biYc0.jar
    Aug 11, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-2mZi1eBfS7nW8SZu5OB-1D2j1o3R8D0FFtf7jM5iO8Q.jar
    Aug 11, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT-Y2Drps_dtkPwdA8atMsaM60hAzE5W8x8mY963cWZ914.jar
    Aug 11, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test6122021083647084166.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-JZmH8MmjagdeQnNXEZYk5hpVhM5IXpLvCSk_rjh526U.jar
    Aug 11, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests-FPCC391zCNHTI8lwCBUsI8iqJecp1euiFLHo39bS0vA.jar
    Aug 11, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.24.0-SNAPSHOT-7g2Mrz4BVTh0rWS72Z9WSvhGEH-alN2k8miHWsjkEww.jar
    Aug 11, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-y-v1yCKj8wYB_sSjaLXpvvhKpU2f1x-01qINYKdYW7E.jar
    Aug 11, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests-otpOki70-FdFuHNPGj_d0SnVRD1i4kriNQSXMzJnkSI.jar
    Aug 11, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests-eoE5Nj_PRU-wWp8y53PcFOHcduRyGXoHmI9x3E_NiNQ.jar
    Aug 11, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests-Kp1QOBPw0loqNKoQIZMy-JFUOIcKVrDNx2LLngrDA38.jar
    Aug 11, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-UJ0xts2E0HHIUKU7pjXo4SMg3jQXESFqersbAlGsqio.jar
    Aug 11, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-yeXrybxMRm5JSkkJtdlinPKfUsQRevW70QhJi96jYP0.jar
    Aug 11, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests-n-sHG2MVnSZkxmpxwb_aY80pu9T05M0DVLPV6ul7n0k.jar
    Aug 11, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests-UJB-8AOUEXu-ygxSFa9IbL0i69Y9R915Ps0lFc0UBrw.jar
    Aug 11, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests-9MXYxcaXHv9aZhlmlJjnseGVueACzmhD-6IoJUaT06U.jar
    Aug 11, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-tests-e_D9xlHk90aa5pR5XMNqrYh4J6ChyAKiSD3Cj6yg_Lw.jar
    Aug 11, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.24.0-SNAPSHOT-QXhs9wz8K4M37WAvgxZylY7xTFgHp32RZv1psmYIi6c.jar
    Aug 11, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-2Ue_W3aqV-7_iaZbQ2lsiBH19hF_cODPGX0SYjhPe-4.jar
    Aug 11, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-4h7WoddLAyU0isKcifjdxYj5DmCL45wMd8rFojLxxb8.jar
    Aug 11, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-VFzqd5DQpm_IoIVGd8J_CMaEnI7RwJM4_wAfBIkSDAM.jar
    Aug 11, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT-JdFdKquwKWyQ8HLP9I6qdPN-1Cu4kzG_tr9wAMAuLIg.jar
    Aug 11, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT-yFV5GFYHsSHUfAJze8XQWhP-OQC5jgaYeNJcRZPx1h0.jar
    Aug 11, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-rid7PWUjtwguHVnbiQbb8fljwButRY_eqBQwhR5FlkM.jar
    Aug 11, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded-GsFZyP4Bu7fNd8I4095xw8oy5HWScesh5_cCuNVffow.jar
    Aug 11, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-k87SZSrN331qbppbLpBq1eYFdWxuZ6OEiunhuCwPlGE.jar
    Aug 11, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.24.0-SNAPSHOT-ucRgeppaeVE5AOoE1C1VxuUU0k_AakDS5SwmRcFJcKs.jar
    Aug 11, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.24.0-SNAPSHOT-x4xCGM1UkoBqjacbSq9leO7GfQR4nZeRvLnwnM7TqwY.jar
    Aug 11, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.24.0-SNAPSHOT-VQaMdvaXLw6GYkbcfRnZ9BwSe--twPZf9mbgnE_H53Q.jar
    Aug 11, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 1 seconds
    Aug 11, 2020 12:45:37 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 11, 2020 12:45:37 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 11, 2020 12:45:37 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 11, 2020 12:45:37 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 11, 2020 12:45:37 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 11, 2020 12:45:37 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93834 bytes, hash efda8f5015b70931ebda01f6040726f3b0362a383ff408c012f57bc7357786fe> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-79qPUBW3CTHr2gH2BAcm87A2Kjg_9AjAEvV7xzV3hv4.pb
    Aug 11, 2020 12:45:37 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.24.0-SNAPSHOT
    Aug 11, 2020 12:45:38 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-11_05_45_37-9031122224296124683?project=apache-beam-testing
    Aug 11, 2020 12:45:38 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-11_05_45_37-9031122224296124683
    Aug 11, 2020 12:45:38 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-11_05_45_37-9031122224296124683
    Aug 11, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-11T12:45:37.679Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 11, 2020 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-11T12:45:45.414Z: Worker configuration: n1-standard-1 in us-central1-f.
    Aug 11, 2020 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-11T12:45:46.145Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 11, 2020 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-11T12:45:46.191Z: Expanding GroupByKey operations into optimizable parts.
    Aug 11, 2020 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-11T12:45:46.228Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 11, 2020 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-11T12:45:46.313Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 11, 2020 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-11T12:45:46.344Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 11, 2020 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-11T12:45:46.385Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 11, 2020 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-11T12:45:46.420Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 11, 2020 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-11T12:45:46.841Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 11, 2020 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-11T12:45:46.934Z: Starting 5 workers in us-central1-f...
    Aug 11, 2020 12:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-11T12:46:10.830Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 11, 2020 12:46:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-11T12:46:21.439Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 11, 2020 12:46:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-11T12:46:31.542Z: Workers have started successfully.
    Aug 11, 2020 12:46:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-11T12:46:31.580Z: Workers have started successfully.
    Aug 11, 2020 12:47:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-11T12:47:05.599Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 11, 2020 12:47:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-11T12:47:05.791Z: Cleaning up.
    Aug 11, 2020 12:47:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-11T12:47:05.880Z: Stopping worker pool...
    Aug 11, 2020 12:47:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-11T12:47:51.073Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 11, 2020 12:47:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-11T12:47:51.118Z: Worker pool stopped.
    Aug 11, 2020 12:47:58 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-11_05_45_37-9031122224296124683 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): a384ff22-509e-4841-b557-8e9ce5401c9a and timestamp: 2020-08-11T12:47:58.873000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    15.012

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 11, 2020 12:47:59 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 3 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 8,5,main]) completed. Took 2 mins 37.039 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 44s
106 actionable tasks: 76 executed, 30 from cache

Publishing build scan...
https://gradle.com/s/hzl7mvgwgka5c

Stopped 2 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #857

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/857/display/redirect?page=changes>

Changes:

[noreply] [BEAM-10300] Improve JdbcIOTest.testFluentBackOffConfiguration stability


------------------------------------------
[...truncated 295.05 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 11, 2020 6:47:55 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 11, 2020 6:47:55 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 11, 2020 6:47:55 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 11, 2020 6:47:55 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 11, 2020 6:47:55 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 11, 2020 6:47:55 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 11, 2020 6:47:56 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 11, 2020 6:47:56 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 11, 2020 6:47:56 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 11, 2020 6:47:57 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 11, 2020 6:47:57 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 11, 2020 6:47:57 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 11, 2020 6:47:57 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 11, 2020 6:47:57 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 11, 2020 6:47:57 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 11, 2020 6:47:59 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 11, 2020 6:48:05 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 11, 2020 6:48:05 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-RVrOsg7kLPtWgf3OMHD6HDXoKFJf8S6tTC42hz8VtaQ.jar
    Aug 11, 2020 6:48:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.24.0-SNAPSHOT-FgsNl_fbDdsbIHBZrnWrZ2Xs0wMKS43MCIlKOs5nRfA.jar
    Aug 11, 2020 6:48:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests-n8YG2WVlGQof9w3voyoO5JzQasscoaxf9dHGeaOu1QU.jar
    Aug 11, 2020 6:48:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests-WqUBv2XqD1B_eNJj6O3vh61bACiwjhLGi6slDkUyCDE.jar
    Aug 11, 2020 6:48:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests-9Z4s7tI8Du_gPO53yiYcaM5rBVoFqOKQAurk1gdDNn0.jar
    Aug 11, 2020 6:48:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-4WBCUn1ZZkyXm9cnfBhTB2hZZvjyB61EGVnIbTExxzc.jar
    Aug 11, 2020 6:48:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-FQ3K95M3w1PpBCD_5RR2LmSQDxDPqyqSvA3Hwjs1CJY.jar
    Aug 11, 2020 6:48:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.24.0-SNAPSHOT-cZJJFRBE73NhXXLPrpZDjfhW0zEiwB0gtm9djJANv3A.jar
    Aug 11, 2020 6:48:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-CkzkpV0FTuAaJA4JodbxxNXXqYQHb15WqHGDS2qEqrw.jar
    Aug 11, 2020 6:48:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests-IZQo_12JjkYnf3IwaUuVocHB8zODejpd_Cbtn-L4HRk.jar
    Aug 11, 2020 6:48:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test1805071332023558133.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-RWaGRFWA7odzI90LSs8kgUDxLbESAzIyHI4W9jBjFEg.jar
    Aug 11, 2020 6:48:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-JRlsxEeZOM2R4jNEpzE0Nr1uQMo2DzAUZc0OMWsMP3o.jar
    Aug 11, 2020 6:48:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT-czaRiEFiHSSQ6phlv8qp-gyV80i95_ss4fsxNAmauMs.jar
    Aug 11, 2020 6:48:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests-24RGztAm9tcqaysc52Sp9aOS9uuzv1HWkQXE4D54qPI.jar
    Aug 11, 2020 6:48:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-SY0kSf95duuXfJ1KTi2YYw-GMllMafTuLVvRF2pSLxM.jar
    Aug 11, 2020 6:48:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT--eyGcJwc08bF5cZRigiHdshjerFe5ikb6HwwXVbI_KY.jar
    Aug 11, 2020 6:48:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-hDFR5d5sj2_LiU9C0TF7ViuzGA6ezCESlUvHfFP7eOA.jar
    Aug 11, 2020 6:48:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-epoSXZrMmat7z6VwaUFE2pwQWykP6cw2eYwCPh16Dzo.jar
    Aug 11, 2020 6:48:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-CHB9zXHqtgaOepBz5-BlfajBI_h7FlXbudgrOQI9D6E.jar
    Aug 11, 2020 6:48:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.24.0-SNAPSHOT-KgpwIxz3ExtJhtuRoLm3IIQbHQ3BsYzYOoEXls5zfXY.jar
    Aug 11, 2020 6:48:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-RVrOsg7kLPtWgf3OMHD6HDXoKFJf8S6tTC42hz8VtaQ.jar
    Aug 11, 2020 6:48:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests-isi3MaMeRiS_qBP9g7sDpYbn82De-FfXyOj8ieqjE6o.jar
    Aug 11, 2020 6:48:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests--f0o_pwXiksik_N8vYNthupfwA8_fz8FxGq01ZrLsvM.jar
    Aug 11, 2020 6:48:10 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.24.0-SNAPSHOT-kx3tMxa9lco50EBufhQ4Q2mBXWe9kjN6Hoz4yhhPPO0.jar
    Aug 11, 2020 6:48:10 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.24.0-SNAPSHOT---E4zm6F3MbkJV2OP0MJfe7bACLl_92j0W6_frmoiUI.jar
    Aug 11, 2020 6:48:11 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.24.0-SNAPSHOT-TEoZvGmspUEwuofcgFTXTH6e-SkZFNvE7zXKA4II8qY.jar
    Aug 11, 2020 6:48:12 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded-9n8ZPEooq9S03thbLX4Y_bcIGUN6DGBrj_F6Ag3X0XQ.jar
    Aug 11, 2020 6:48:12 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT-eZETnqIQEWu8ex_8KhKgI9UxM4v3MBImaKkLN1g9S2I.jar
    Aug 11, 2020 6:48:12 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-27v5nAm3Bc7gFeBwW9AMvacAebjMEuNWOOz5miCbVIY.jar
    Aug 11, 2020 6:48:12 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-tests-HK6SpdGxuFhx6cekCAV28JZdi3uDyka4uqzznDfkZtc.jar
    Aug 11, 2020 6:48:12 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-ISbQb5Qos-b9TKI1rmAkvS488W2iQ2gtOVcYRXipzO4.jar
    Aug 11, 2020 6:48:13 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 8 seconds
    Aug 11, 2020 6:48:13 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 11, 2020 6:48:13 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 11, 2020 6:48:13 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 11, 2020 6:48:13 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 11, 2020 6:48:13 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 11, 2020 6:48:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93834 bytes, hash 326b3263da19dc0e5d1ae79d18089d35ab1e5eead88af1639f58fd24bfebf96e> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-MmsyY9oZ3A5dGuedGAidNaseXurYivFjn1j9JL_r-W4.pb
    Aug 11, 2020 6:48:14 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.24.0-SNAPSHOT
    Aug 11, 2020 6:48:15 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-10_23_48_14-13221072699101219476?project=apache-beam-testing
    Aug 11, 2020 6:48:15 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-10_23_48_14-13221072699101219476
    Aug 11, 2020 6:48:15 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-10_23_48_14-13221072699101219476
    Aug 11, 2020 6:48:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-11T06:48:14.589Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 11, 2020 6:48:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-11T06:48:22.062Z: Worker configuration: n1-standard-1 in us-central1-f.
    Aug 11, 2020 6:48:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-11T06:48:22.641Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 11, 2020 6:48:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-11T06:48:22.683Z: Expanding GroupByKey operations into optimizable parts.
    Aug 11, 2020 6:48:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-11T06:48:22.713Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 11, 2020 6:48:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-11T06:48:22.791Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 11, 2020 6:48:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-11T06:48:22.817Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 11, 2020 6:48:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-11T06:48:22.865Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 11, 2020 6:48:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-11T06:48:22.900Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 11, 2020 6:48:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-11T06:48:23.252Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 11, 2020 6:48:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-11T06:48:23.331Z: Starting 5 workers in us-central1-f...
    Aug 11, 2020 6:48:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-11T06:48:33.920Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 11, 2020 6:48:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-11T06:48:49.692Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 11, 2020 6:49:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-11T06:49:14.931Z: Workers have started successfully.
    Aug 11, 2020 6:49:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-11T06:49:14.968Z: Workers have started successfully.
    Aug 11, 2020 6:49:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-11T06:49:47.055Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 11, 2020 6:49:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-11T06:49:47.257Z: Cleaning up.
    Aug 11, 2020 6:49:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-11T06:49:47.342Z: Stopping worker pool...
    Aug 11, 2020 6:50:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-11T06:50:41.217Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 11, 2020 6:50:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-11T06:50:41.271Z: Worker pool stopped.
    Aug 11, 2020 6:50:49 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-10_23_48_14-13221072699101219476 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 5c71d449-fa0d-4e4d-909d-c9aa688954cc and timestamp: 2020-08-11T06:50:49.346000000Z:
                     Metric:                    Value:
                   read_time                    13.132
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 11, 2020 6:50:50 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.094 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.083 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker,5,main]) completed. Took 3 mins 23.04 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 6m 7s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/s2isbupk6bs7o

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #856

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/856/display/redirect?page=changes>

Changes:

[noreply] [BEAM-9891] Added ZetaSQL planner support and uploaded 100G data


------------------------------------------
[...truncated 292.51 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 11, 2020 12:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 11, 2020 12:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 11, 2020 12:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 11, 2020 12:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 11, 2020 12:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 11, 2020 12:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 11, 2020 12:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
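
The failure above is the coder-inference problem the message itself describes: the output of ParDo(RowMonitor) is a PCollection of Row elements with no schema attached, so no RowCoder can be inferred. A minimal, hypothetical sketch of the fix the message names follows; the class, DoFn, and schema fields below are illustrative stand-ins, not the test's own code.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        // Illustrative schema; the real HACKER_NEWS rows have more fields.
        Schema schema =
            Schema.builder().addStringField("author").addInt64Field("score").build();

        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        PCollection<Row> monitored =
            p.apply(Create.of(Row.withSchema(schema).addValues("alice", 5L).build())
                    .withRowSchema(schema))
                .apply(
                    "RowMonitorLike", // pass-through DoFn standing in for ParDo(RowMonitor)
                    ParDo.of(
                        new DoFn<Row, Row>() {
                          @ProcessElement
                          public void processElement(ProcessContext c) {
                            c.output(c.element());
                          }
                        }))
                // Without this call, coder inference fails exactly as logged:
                // "Cannot provide a coder for a Beam Row. Please provide a schema
                // instead using PCollection.setRowSchema."
                .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }

In this build only readUsingDirectReadMethodPushDown reaches a DONE Dataflow job; readUsingDefaultMethod and readUsingDirectReadMethod both stop at pipeline construction with this same coder error, which is why the summary below reports 3 tests completed, 2 failed.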

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 11, 2020 12:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 11, 2020 12:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 11, 2020 12:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 11, 2020 12:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 11, 2020 12:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 11, 2020 12:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 11, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 11, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
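
For context on what is being pushed down: the stack traces and CalciteQueryPlanner lines show the test issuing the SELECT above through Beam SQL, and the BeamPushDownIOSourceRel/BigQueryFilter entries record that both the projection (by, type, title, score) and the filter were handed to the BigQuery storage read. A minimal, hypothetical sketch of running the same query with Beam SQL is shown below; it uses an in-memory PCollection instead of the HACKER_NEWS BigQuery table, so no push-down actually occurs, and all names in it are illustrative.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class PushDownQuerySketch {
      public static void main(String[] args) {
        // Only the fields referenced by the logged query; the real table has more.
        Schema schema =
            Schema.builder()
                .addStringField("by")
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();

        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        PCollection<Row> rows =
            p.apply(
                Create.of(
                        Row.withSchema(schema).addValues("alice", "story", "A story", 10L).build(),
                        Row.withSchema(schema).addValues("bob", "comment", "A reply", 3L).build())
                    .withRowSchema(schema));

        // A single input PCollection is visible to Beam SQL under the table name PCOLLECTION.
        PCollection<Row> filtered =
            rows.apply(
                SqlTransform.query(
                    "SELECT `by` AS author, type, title, score "
                        + "FROM PCOLLECTION "
                        + "WHERE (type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }

Whether a filter is pushed down depends on the table provider in use; the "supported{...}" part of the BigQueryFilter line above is what confirms it for the DIRECT_READ BigQuery table in this run.
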
    Aug 11, 2020 12:45:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 11, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 11, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-d50hW0sZJPw5e_p_L9N8LvNv0K4W3woELgZgfHg7tjg.jar
    Aug 11, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT-pe0Ocpip_nnBzMcu-mWJ51ozpbsmsps9sfzYsjF6I_M.jar
    Aug 11, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.24.0-SNAPSHOT-HZwCii35Wdy7-lXJ-X_4w1NVQ-yP1gmHPvbf78ywRSo.jar
    Aug 11, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests-WCZqzzi5COcdn7oT9LTSZFZhGKVQEVxO-q1euc58kt0.jar
    Aug 11, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests-NSwuu9zSkuJRnl1QVTjn0QRXd8IxrSP_xrQLWuoKKqU.jar
    Aug 11, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2816513599780113584.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-ByWc5mS_bSa6s-ZlVAgPshrd3JxCIZrrDsN7WOu23fo.jar
    Aug 11, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-n7mo6DnOD7vtU4NSNcv02h6GVxAin3rq3VwKR0e1Qw4.jar
    Aug 11, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests-0WHNjexZKRi_tNwGbH0njsqiGjaWnkL7lXuWAAFC8N4.jar
    Aug 11, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.24.0-SNAPSHOT-S3uq4XatVRbOOtbLyhmHt7cx0f0WdAAtMRmp24ZROy8.jar
    Aug 11, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-OO6usWdHDcyJz52y8s7kcEo6cByaLA-xAWqaguOoolI.jar
    Aug 11, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-XmB5CU_6qszOnz7XaBmc4LSSYgGfmmjI1lS0lCVsUb8.jar
    Aug 11, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests-howLwKs5Yk0bGGiM0CT1bzYKra637KHWAyu4sNRhb2s.jar
    Aug 11, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT-jD3SPiz6blqq9cKZH28xMFEP4mTGH4uOVmwUVpBF0D8.jar
    Aug 11, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-d50hW0sZJPw5e_p_L9N8LvNv0K4W3woELgZgfHg7tjg.jar
    Aug 11, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT-TYvuUHV794uGtErM9AYKBV0NScRpNaQJS73zV1fLGKQ.jar
    Aug 11, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.24.0-SNAPSHOT--UpC9U-I_W8Bg27vSIQDlYldVRNYCdSxwSPpZKxlS1M.jar
    Aug 11, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-r-YKiuMGXZ5xalaMVeqnXy98ri-BKeBgoPdZs33X8kY.jar
    Aug 11, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests-wAy-Hn29_bbbJem4eHhPW2bOqRFVYpnxjaul0G5-zFk.jar
    Aug 11, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-IkdehQC3UL6bIgzIcuGiwnrNV7ipnSYm2p1JN9f2yG4.jar
    Aug 11, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests-tPaVADpnEoGw9LrbEssbWeqNHXLVnMJmSbvrWt4sKHE.jar
    Aug 11, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests-ZFn4M4qDV_MZI7wU994w2y-5jLl4rW4XMutNqxmJ5LA.jar
    Aug 11, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-leReFh6LM8DkczQ6RrmFsDSCnM-JRSmXDT_IICy5QBE.jar
    Aug 11, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded-76CJnN-uOP_ni47nBWR2s-veoSpob-BhxP23_F6_feU.jar
    Aug 11, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-JGQ4LMYbq2ObxDMY_ngsga1LAEg2ZxO2Wv8vb8XtzzM.jar
    Aug 11, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-k6rOdxzKbNeHbImJnuUiIcEsqa1d-TQy9xdv9vnRtvQ.jar
    Aug 11, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-ZFb-XFS79HQb-lsWkUY7_suuDsvvSB1g4LNNsBE94Dw.jar
    Aug 11, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-tests-9sM9i-MtIzXWVz_hNOqd5w2nx6D4kmp9_Ltlfzi_lFs.jar
    Aug 11, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT--JfN_hh1lYIEOzfw_dA_Kxt74a3DbN2rUVp0BamaESc.jar
    Aug 11, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.24.0-SNAPSHOT-l-C863hkHecFTtX0eTTT6Exnaq1SPVy2XEcVn2iJz4Q.jar
    Aug 11, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.24.0-SNAPSHOT-p964o-spHdAxUixHXq8Nhm-EJzZTsU-MEVCRFaKC4wQ.jar
    Aug 11, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.24.0-SNAPSHOT-P6ygSDyYRCFbplK0dG5OiTsXdRkt0XFwYXgBNeH6AFA.jar
    Aug 11, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 1 seconds
    Aug 11, 2020 12:45:19 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 11, 2020 12:45:19 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 11, 2020 12:45:19 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 11, 2020 12:45:19 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 11, 2020 12:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 11, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93834 bytes, hash 710111220b4d6e0d2d6fa9ee295bdea95ffcf000c61b63330978473db9ac4ea6> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-cQERIgtNbg0tb6nuKVveqV_88ADGG2MzCXhHPbmsTqY.pb
    Aug 11, 2020 12:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.24.0-SNAPSHOT
    Aug 11, 2020 12:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-10_17_45_20-6290686188624170660?project=apache-beam-testing
    Aug 11, 2020 12:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-10_17_45_20-6290686188624170660
    Aug 11, 2020 12:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-10_17_45_20-6290686188624170660
    Aug 11, 2020 12:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-11T00:45:20.112Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 11, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-11T00:45:27.832Z: Worker configuration: n1-standard-1 in us-central1-a.
    Aug 11, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-11T00:45:28.484Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 11, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-11T00:45:28.526Z: Expanding GroupByKey operations into optimizable parts.
    Aug 11, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-11T00:45:28.554Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 11, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-11T00:45:28.615Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 11, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-11T00:45:28.647Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 11, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-11T00:45:28.683Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 11, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-11T00:45:28.707Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 11, 2020 12:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-11T00:45:29.094Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 11, 2020 12:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-11T00:45:29.177Z: Starting 5 workers in us-central1-a...
    Aug 11, 2020 12:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-11T00:45:52.575Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 11, 2020 12:45:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-11T00:45:53.913Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 11, 2020 12:46:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-11T00:46:11.627Z: Workers have started successfully.
    Aug 11, 2020 12:46:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-11T00:46:11.656Z: Workers have started successfully.
    Aug 11, 2020 12:46:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-11T00:46:44.875Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 11, 2020 12:46:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-11T00:46:45.046Z: Cleaning up.
    Aug 11, 2020 12:46:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-11T00:46:45.132Z: Stopping worker pool...
    Aug 11, 2020 12:47:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-11T00:47:38.553Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 11, 2020 12:47:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-11T00:47:38.595Z: Worker pool stopped.
    Aug 11, 2020 12:47:44 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-10_17_45_20-6290686188624170660 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 0ad34505-b91a-453c-b263-3a79682f2e58 and timestamp: 2020-08-11T00:47:44.728000000Z:
                     Metric:                    Value:
                   read_time                     14.12
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 11, 2020 12:47:45 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.016 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 2 mins 38.668 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 28s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/2lf5d5p63orqm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #855

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/855/display/redirect?page=changes>

Changes:

[Robin Qiu] Move value conversion logic out of ExpressionConverter

[Robin Qiu] Simplify ZetaSqlBeamTranslationUtils

[Etienne Chauchot] [BEAM-10471] change the test condition for testEstimatedSizeBytes to

[Colm O hEigeartaigh] BEAM-10668 - Replace toLowerCase().equals() with equalsIgnoreCase

[noreply] [BEAM-10361] upgrade Kotlin version in example (#12497)


------------------------------------------
[...truncated 293.33 KB...]
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 10, 2020 6:45:21 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 10, 2020 6:45:21 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 10, 2020 6:45:22 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 10, 2020 6:45:22 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 10, 2020 6:45:22 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 10, 2020 6:45:22 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 10, 2020 6:45:22 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 10, 2020 6:45:22 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 10, 2020 6:45:22 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 10, 2020 6:45:22 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 10, 2020 6:45:22 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 10, 2020 6:45:22 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 10, 2020 6:45:22 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 10, 2020 6:45:22 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 10, 2020 6:45:22 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 10, 2020 6:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 10, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 10, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-34wo3LNHYhzm2ze8RhQFRSzAne7g4tDL1CQVJrTeVHA.jar
    Aug 10, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests-F405bLntf6jgksl1x1p6O3s3p20IFELgmBNFyUjT8ic.jar
    Aug 10, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-zyKBGYQNMkd4qVPKevjG0ltrBeZqSoZ5xCaYiggth9s.jar
    Aug 10, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT-9to2JVEOgjGHktPSzR2Y5Gdulyyp6neHtesPJ0XWHIc.jar
    Aug 10, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-GkBpIyexg1NBO9mf6l2YWw89re_E6J6kqvMth-r6aWs.jar
    Aug 10, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5171792243846878453.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-QG_iNnwSqhPiGGNw9wZQloZdErTD5lWJpgutGhSCSnQ.jar
    Aug 10, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-3xxMnNYjVY1qHhuvZwvus9nxDsxdYMYrE9dUgaunq1c.jar
    Aug 10, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-RT38T1hvMpjYBiPwCTbxX1ZxIOPB6bKzfvwyzZsT_yQ.jar
    Aug 10, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT-5v_wLzFk7TlNRZx7HE7Ngl4KhCGAuE9wnk2TvLIH7OE.jar
    Aug 10, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests-gtiAXYQI5vdN7m2opE8O6p0bayilQcTfd0DYsP6axJc.jar
    Aug 10, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-CJUCvnaqA1FfU03bGqePuysjukdmA8LnPfEMWBsoP04.jar
    Aug 10, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-hYYTmC3riNukGcunaIPrOxv0aRzWO2So7Im1Bb-vWRQ.jar
    Aug 10, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests-P5-aihmfJcVKbXPolw2SbaT2vApAKNOLYbfRUFx8j-c.jar
    Aug 10, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests-UgrSosLRy7m0o3IYMDkvw_VL9WvW48e7lSUmdU7l-Nk.jar
    Aug 10, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests-9XGFq3QhV-slZLBXttng6BYHZVDKNi2uw_aE1gv0fk8.jar
    Aug 10, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.24.0-SNAPSHOT-jggIwVM0t4cc-R54SoH60BS6WO1EoFIEAC3WUeWuxVA.jar
    Aug 10, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.24.0-SNAPSHOT-TpuE2HJZxbdFDiNcZ-ep_6tDjHTRLTKOsBhr4ZyQfSU.jar
    Aug 10, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-gK8p5Qv7rh7W9dZ_H_RfuL24aIEYzqXiPrMm9PUQasM.jar
    Aug 10, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-34wo3LNHYhzm2ze8RhQFRSzAne7g4tDL1CQVJrTeVHA.jar
    Aug 10, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT-Sik75iaPxW_kfMnOpvOybdLsz6qH5E12FOspOFcbU9Q.jar
    Aug 10, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-TYdgFilyMYUDKv3IVVTYhuc6JOPjVr-WDEqEnU6_kNo.jar
    Aug 10, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests-KipSfrt4eivsKwjquJ269dAOfZEvRCFgGREv40NJwYU.jar
    Aug 10, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests-utrHvjbHDK6ytCRQZ4shiH2O51qqZZuFipFlDzfp2TU.jar
    Aug 10, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-xMwd2RnYHwvyLhD0lnM3RUC9AMVi0ONHdpQ8s4m-4Aw.jar
    Aug 10, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-ontQDdJlQxLj6Jz6WpSE_uG-jG34Rr8FjZ33jdyD1_A.jar
    Aug 10, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded-TIxCnppE2PD3LCO1qrgqkk80c_qWLNYQ2m0ndty-k4I.jar
    Aug 10, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.24.0-SNAPSHOT-1D6Gakt7hCKox-3ekMGsWVh6wHNY-IliRfD3o1Hs3Pc.jar
    Aug 10, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-tests-LaCCo8zS-JAIrsAQai1rJxfux4uy-bAQG94APuubFuw.jar
    Aug 10, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.24.0-SNAPSHOT-k0DRrsOFxrCbsxiQTY3OmHdrehs_MDSiHHIV5rJKI7Y.jar
    Aug 10, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.24.0-SNAPSHOT-6Rlh_EnVV1o_2P29-Px6B4yKYovdPZ_FSvET97EzgL8.jar
    Aug 10, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.24.0-SNAPSHOT-sTnJG_e2txbbAhk2jeZF18l9nBFe856knJtBHn3qIX8.jar
    Aug 10, 2020 6:45:29 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 1 seconds
    Aug 10, 2020 6:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 10, 2020 6:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 10, 2020 6:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 10, 2020 6:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 10, 2020 6:45:29 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 10, 2020 6:45:29 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93834 bytes, hash 5ef8a98b9b1ca0a60e97771120faaca30f72efc5e7f9953e85f96797efdd5595> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Xvipi5scoKYOl3cRIPqsow9y78Xn-ZU-hflnl-_dVZU.pb
    Aug 10, 2020 6:45:29 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.24.0-SNAPSHOT
    Aug 10, 2020 6:45:31 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-10_11_45_29-13922823160326413692?project=apache-beam-testing
    Aug 10, 2020 6:45:31 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-10_11_45_29-13922823160326413692
    Aug 10, 2020 6:45:31 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-10_11_45_29-13922823160326413692
    Aug 10, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-10T18:45:29.862Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
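
That warning is expected for this job rather than a problem: with autoscalingAlgorithm=NONE, Dataflow uses the fixed numWorkers value and ignores maxNumWorkers, which matches the 5 workers started below. A sketch of the equivalent pipeline options, with illustrative values inferred from the log rather than copied from the test's actual flags:

    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions.AutoscalingAlgorithmType;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class FixedWorkerPoolSketch {
      public static void main(String[] args) {
        DataflowPipelineOptions options =
            PipelineOptionsFactory.fromArgs(args).as(DataflowPipelineOptions.class);
        // Autoscaling is disabled, so the fixed worker count below is what the service uses...
        options.setAutoscalingAlgorithm(AutoscalingAlgorithmType.NONE);
        options.setNumWorkers(5);
        // ...and maxNumWorkers is ignored, which is exactly what the warning reports.
        options.setMaxNumWorkers(5);
      }
    }
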
    Aug 10, 2020 6:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-10T18:45:38.119Z: Worker configuration: n1-standard-1 in us-central1-a.
    Aug 10, 2020 6:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-10T18:45:38.816Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 10, 2020 6:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-10T18:45:38.893Z: Expanding GroupByKey operations into optimizable parts.
    Aug 10, 2020 6:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-10T18:45:38.924Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 10, 2020 6:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-10T18:45:38.981Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 10, 2020 6:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-10T18:45:39.007Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 10, 2020 6:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-10T18:45:39.037Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 10, 2020 6:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-10T18:45:39.063Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 10, 2020 6:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-10T18:45:39.495Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 10, 2020 6:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-10T18:45:39.615Z: Starting 5 workers in us-central1-a...
    Aug 10, 2020 6:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-10T18:45:54.343Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 10, 2020 6:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-10T18:46:06.369Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Aug 10, 2020 6:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-10T18:46:06.409Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Aug 10, 2020 6:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-10T18:46:11.794Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 10, 2020 6:46:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-10T18:46:26.668Z: Workers have started successfully.
    Aug 10, 2020 6:46:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-10T18:46:26.695Z: Workers have started successfully.
    Aug 10, 2020 6:47:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-10T18:47:03.582Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 10, 2020 6:47:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-10T18:47:03.837Z: Cleaning up.
    Aug 10, 2020 6:47:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-10T18:47:03.920Z: Stopping worker pool...
    Aug 10, 2020 6:47:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-10T18:47:55.308Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 10, 2020 6:47:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-10T18:47:55.363Z: Worker pool stopped.
    Aug 10, 2020 6:48:02 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-10_11_45_29-13922823160326413692 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): ee695257-f715-4ec7-9f3c-beaff32a2443 and timestamp: 2020-08-10T18:48:02.972000000Z:
                     Metric:                    Value:
                   read_time                    15.123
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 10, 2020 6:48:03 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.018 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':',5,main]) completed. Took 2 mins 52.927 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 47s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/mylkrhoqqeqiw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #854

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/854/display/redirect>

Changes:


------------------------------------------
[...truncated 292.83 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 10, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 10, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 10, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 10, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 10, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 10, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 10, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 10, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 10, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 10, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 10, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 10, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 10, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 10, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 10, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 10, 2020 12:45:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 10, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 10, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-YlE6-FMYvCiuNnQ6wWeL36LkAWdVasUimUoa_RsAiBo.jar
    Aug 10, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-S7s0PSpN9nK33O99IwkdjomcFap4Y_SarV_qBCUGS6k.jar
    Aug 10, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-u3G158fMPC3J-jwAlbO7BNcQKGEZrOjRnyULLJXTDYQ.jar
    Aug 10, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests--p1zcIqQKUTHQPxsWnhW3LrzR8wACZIp2BOdw560NUI.jar
    Aug 10, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.24.0-SNAPSHOT-dgFaPHwUaKsGbP-XkSv6NAb72rcgZDyRree4rMwgLag.jar
    Aug 10, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-QhsTZNzm4tZLBJ89BcoDCF1rp05Jq02yzFTc5yu4giQ.jar
    Aug 10, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7113546457144953729.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-vQFaVlFZDTxo5MDxMbu4NSvlaESLgTIiNVVKnaUWoRE.jar
    Aug 10, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-Qh0_qdNxcgkToE3SFsfhxbZgHixIsrONwhlPRFmh4sA.jar
    Aug 10, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.24.0-SNAPSHOT-0zWI0Szc-mFTw2kizSjglTp6s05a0nAtnwyQWlnds3Y.jar
    Aug 10, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-77zs5Gy-JU61GThJx5a2knc2PMo4OuLDYs89O89n_oE.jar
    Aug 10, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-YlE6-FMYvCiuNnQ6wWeL36LkAWdVasUimUoa_RsAiBo.jar
    Aug 10, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests-zgqODfExdeoTl70BsQEmJu-Y3lnFuSNkR3DdL6ewco4.jar
    Aug 10, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests-2LIUCs4hSq2c26uDvrbbg7joaVrufQ7U3LFr4BJWxsU.jar
    Aug 10, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT-xSPLuDrc1r89vPzSqbp9eDLN_qr8l5FJMsLkdQUYPvc.jar
    Aug 10, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.24.0-SNAPSHOT-qsVbYVSgrlJWyehdosE3PGq3BVfPgpT7IIw_Mj0140c.jar
    Aug 10, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-MuzRc4reTVCp4A_CzbQpX7iFRl_fCPsyd4bdsK7Z7sY.jar
    Aug 10, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-y9U0EwnxwfMnd0MibM-imK47x4f6yBsz7jv4_wt60Us.jar
    Aug 10, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests-LqNyNIcbdclOinMmdgQTdUhOhkw1zGkeV5M-vYHR1fU.jar
    Aug 10, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests-O0Ljeab7zhUAgGcIhTHjT-90BxUucPYDXq6T-Rochck.jar
    Aug 10, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-lBkDDt3rNBIaIXNvwBdoFhvaUfAD_FGOkfibIPdWJNw.jar
    Aug 10, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests-m8YVQ9ADRzdqmjoyE-qlNRd7eBRw_WCforCx5eLDpW0.jar
    Aug 10, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-333W35SwivjIJZ78C9xiVUfwwmbFbaPfSpBkVST4yI0.jar
    Aug 10, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT-6XQ2ECtjZBCR2JN6IhCaYo_erTtXwzQ9qTWLyu3nfgE.jar
    Aug 10, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-tests-ODsCMLzo2tOaEO4hyypEhhD3S3O0qWr0fWj-EsKVUBc.jar
    Aug 10, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests-XEiScsQWL3iBYiG-jQKJFydc09O9GSDFYpl0Gvbe9vk.jar
    Aug 10, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-SSLxPh-HC69Pk1EdW-cQKGL61_fUsab2ulXtjCqOsAA.jar
    Aug 10, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded-nKZAbMZ0TBNeY9eN3Ow5ymAiy-jreG-raNletxJVTW8.jar
    Aug 10, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT-nsYysOwYMtmJT9Nbc1Oi53N7uI8R6LO1gpCI53NUevk.jar
    Aug 10, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.24.0-SNAPSHOT-iQx0DrfGb4TXaF6gW2jbZELwHwORp7-XmrdFmbHxEb0.jar
    Aug 10, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.24.0-SNAPSHOT-N9x3snZN6ltBO7TcJdvZL5j_eMj1k-Ki7yh-IUVBqeQ.jar
    Aug 10, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.24.0-SNAPSHOT-QmqS3eYZvlg0_f7DSi-QBZedX_lOUiqGVryqed2mYDc.jar
    Aug 10, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 1 seconds
    Aug 10, 2020 12:45:17 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 10, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 10, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 10, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 10, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 10, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93835 bytes, hash d2bc0e740d9dce81135a2168f4680291c61ef93500dad5094267720b4ca2cd94> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-0rwOdA2dzoETWiFo9GgCkcYe-TUA2tUJQmdyC0yizZQ.pb
    Aug 10, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.24.0-SNAPSHOT
    Aug 10, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-10_05_45_18-10521540799365639370?project=apache-beam-testing
    Aug 10, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-10_05_45_18-10521540799365639370
    Aug 10, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-10_05_45_18-10521540799365639370
    Aug 10, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-10T12:45:18.495Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 10, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-10T12:45:26.301Z: Worker configuration: n1-standard-1 in us-central1-f.
    Aug 10, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-10T12:45:27.031Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 10, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-10T12:45:27.064Z: Expanding GroupByKey operations into optimizable parts.
    Aug 10, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-10T12:45:27.108Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 10, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-10T12:45:27.184Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 10, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-10T12:45:27.220Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 10, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-10T12:45:27.246Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 10, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-10T12:45:27.275Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 10, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-10T12:45:27.757Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 10, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-10T12:45:27.828Z: Starting 5 workers in us-central1-f...
    Aug 10, 2020 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-10T12:45:51.677Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 10, 2020 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-10T12:45:54.526Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 10, 2020 12:46:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-10T12:46:15.376Z: Workers have started successfully.
    Aug 10, 2020 12:46:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-10T12:46:15.407Z: Workers have started successfully.
    Aug 10, 2020 12:46:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-10T12:46:46.098Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 10, 2020 12:46:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-10T12:46:46.287Z: Cleaning up.
    Aug 10, 2020 12:46:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-10T12:46:46.372Z: Stopping worker pool...
    Aug 10, 2020 12:47:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-10T12:47:37.472Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 10, 2020 12:47:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-10T12:47:37.529Z: Worker pool stopped.
    Aug 10, 2020 12:47:43 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-10_05_45_18-10521540799365639370 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 5d280c67-84aa-418d-b36d-deb4c987acc6 and timestamp: 2020-08-10T12:47:43.962000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    12.039

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 10, 2020 12:47:44 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 2 mins 39.937 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 29s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/mjhigh3gtq2cw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #853

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/853/display/redirect>

Changes:


------------------------------------------
[...truncated 292.41 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 10, 2020 6:45:10 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 10, 2020 6:45:10 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 10, 2020 6:45:10 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 10, 2020 6:45:10 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 10, 2020 6:45:10 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 10, 2020 6:45:10 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 10, 2020 6:45:10 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
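
For reference, the remedy the exception message points at looks roughly like the sketch below. This is a minimal illustration, not the integration test's actual code: the input PCollection `rows`, the pass-through DoFn, and the field names/types (taken from the SELECT above, with `score` assumed to be INT64) are all assumptions. The key point is that a transform emitting Beam `Row`s needs a schema attached via `setRowSchema`, otherwise `PCollection.getCoder()` fails exactly as in the trace above.

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    // Schema matching the projected columns of the query (types assumed).
    Schema schema = Schema.builder()
        .addStringField("author")
        .addStringField("type")
        .addStringField("title")
        .addInt64Field("score")
        .build();

    // `rows` is assumed to be the PCollection<Row> produced by the source rel.
    PCollection<Row> monitored = rows
        .apply("RowMonitor", ParDo.of(new DoFn<Row, Row>() {
          @ProcessElement
          public void process(@Element Row row, OutputReceiver<Row> out) {
            out.output(row);  // pass-through; a real monitor would also record metrics
          }
        }))
        // Without this, the CoderRegistry cannot infer a coder for Row and the
        // IllegalStateException above is thrown when the pipeline is finalized.
        .setRowSchema(schema);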

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 10, 2020 6:45:10 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 10, 2020 6:45:10 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 10, 2020 6:45:10 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 10, 2020 6:45:10 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 10, 2020 6:45:10 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 10, 2020 6:45:10 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 10, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 10, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
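
At the BigQueryIO level, the pushed-down read logged above corresponds roughly to a DIRECT_READ with selected fields and a row restriction. The sketch below is an illustration only: the pipeline `p` and the table reference are assumptions, and the field list/filter are copied from the plan above rather than from the job's real configuration.

    import com.google.api.services.bigquery.model.TableRow;
    import java.util.Arrays;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;
    import org.apache.beam.sdk.values.PCollection;

    PCollection<TableRow> rows = p.apply(
        "Read Input BQ Rows with push-down",
        BigQueryIO.readTableRows()
            .from("bigquery-public-data:hacker_news.full")  // hypothetical table reference
            .withMethod(TypedRead.Method.DIRECT_READ)
            // Only the projected columns are requested from the Storage API.
            .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
            // The supported part of the WHERE clause becomes a row restriction.
            .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));
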
    Aug 10, 2020 6:45:12 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 10, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 10, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-PM2Z1koBgy6Se4tj8VeF6z3DNrTz6Ud5TIYTCgySm1E.jar
    Aug 10, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-Wwu5JQed4U8VgmDTAIdt6moyxijgxjJPyrWhUojzhRg.jar
    Aug 10, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-PM2Z1koBgy6Se4tj8VeF6z3DNrTz6Ud5TIYTCgySm1E.jar
    Aug 10, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests-0kpsfhbdcC5wPF9oMHbNvolO_0EaqN50cO4-iIbDmQc.jar
    Aug 10, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests-ohwg8wNWBwP6oNtF_4mRXxqA8oXzOlVwX1ujpX5gP_8.jar
    Aug 10, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT-tGcfZ_TdOhUsb0PSjV0L_ks1WlIwgsavUtImYTanye8.jar
    Aug 10, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT-veVnEqqBmIwHWAAgXIrB1Apl8FitHUuDKksJllk7gEo.jar
    Aug 10, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests-FGjDty7fxmSqkRflDu8jO2uFwPNahd9IlEG-BDcsv9Q.jar
    Aug 10, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-OPUrsdkqr4PB7wfYzL-deIQjE8R8bnWX_9BbX85kre0.jar
    Aug 10, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-pBaFWUs8Ng7Fr6WATJQyWmax0_zlDf7DX-62uMGokHI.jar
    Aug 10, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded-b8qUmfbqNcluhHEQ68krYfaJBXa33o1_h-RoO_JkO5g.jar
    Aug 10, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests-8RgNsrIqdzqexmuIjnf1hG-pjBvKYCZvbnICW9g5EMw.jar
    Aug 10, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests-H3cXSY0JVOu7pc68Hu6ViiGhjfMUXBwt-b1q7QX4E3Y.jar
    Aug 10, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.24.0-SNAPSHOT-Evk89k7uhFreD9nzBqHKSFvXmEKZGHFO0w5rXwPmkxw.jar
    Aug 10, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.24.0-SNAPSHOT-qfc3mEuYjA0fflM4QgVlLZyH7TCH6iulGukhJbkKICU.jar
    Aug 10, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-_THsMQVF0xDbcb_jQtBb4qdCAz9zv4J3earxJ9w3JdI.jar
    Aug 10, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-_yFsh2OSLQCzCPidsXifuCF52zHyH5IhO67NuNNinps.jar
    Aug 10, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT-wM5na9gd4sf0PTyhvrG410FLP7zRjEWilztiQQLAfkY.jar
    Aug 10, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-TOQ8oGCrs3BvKYAjBNSOM29L6dP6e780SUL161cs-nU.jar
    Aug 10, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-EUarDZanqTp1P9qeNBpwby23vugXMcwsGrAQMDmTcBY.jar
    Aug 10, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-XsNLsm9lKDKnIUD3Qxw2BfxHbU_TpbO8dmOf2DfWHWs.jar
    Aug 10, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4177037022643843865.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-HM0oKR5mjvfGBBi185P9obMRBX94LphOS-SnwpQSBNg.jar
    Aug 10, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.24.0-SNAPSHOT-JPzbwNIAthkStF3cEeUN6QdR03NDSkk8Y2wKz9gRclk.jar
    Aug 10, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests-y8RDzqe89rLQEqqiL93h64j2yYtKbfXpAWAZeb78gd4.jar
    Aug 10, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-xvzArkrbozuTE090bKe8SujGBpqjpDneWewCCIPqyFg.jar
    Aug 10, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-tests-BQ0QoTTcEZ3_2YIHWVp9JNWOCrpqoZxp4FmKeQtQUw0.jar
    Aug 10, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-umEJFANul9qFBbNrzRcGP5IpNyP5nOEAPOHVcEbZYNA.jar
    Aug 10, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests-al70x_6GoWfxwrPeE0nwhv5ou6pTVujEtv_Ohn-N7P4.jar
    Aug 10, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.24.0-SNAPSHOT-8ABu1KFtJ5oilYoEfGKOosNnRGGeiMQQaUlCcy_XZ3Q.jar
    Aug 10, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.24.0-SNAPSHOT-1mLZaFT3HgvS8jEi8IDuIebaf2h510n_ojK3VhUCk2c.jar
    Aug 10, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.24.0-SNAPSHOT-AxnhSlRF5sdyIbRGFNXB_pCYSjtTTWa21YDX6l-aj28.jar
    Aug 10, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 1 seconds
    Aug 10, 2020 6:45:15 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 10, 2020 6:45:15 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 10, 2020 6:45:15 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 10, 2020 6:45:15 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 10, 2020 6:45:15 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 10, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93834 bytes, hash 3ef5a6c923e5b640caa85d0c6c4bb8e38804aad9ef45daf61217490b0636a0f8> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-PvWmySPltkDKqF0MbEu444gEqtnvRdr2EhdJCwY2oPg.pb
    Aug 10, 2020 6:45:15 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.24.0-SNAPSHOT
    Aug 10, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-09_23_45_15-4947355697951739492?project=apache-beam-testing
    Aug 10, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-09_23_45_15-4947355697951739492
    Aug 10, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-09_23_45_15-4947355697951739492
    Aug 10, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-10T06:45:15.754Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 10, 2020 6:45:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-10T06:45:22.762Z: Worker configuration: n1-standard-1 in us-central1-f.
    Aug 10, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-10T06:45:23.709Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 10, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-10T06:45:23.754Z: Expanding GroupByKey operations into optimizable parts.
    Aug 10, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-10T06:45:23.791Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 10, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-10T06:45:23.861Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 10, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-10T06:45:23.896Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 10, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-10T06:45:23.930Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 10, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-10T06:45:23.964Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 10, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-10T06:45:24.306Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 10, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-10T06:45:24.376Z: Starting 5 workers in us-central1-f...
    Aug 10, 2020 6:45:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-10T06:45:47.565Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 10, 2020 6:45:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-10T06:45:49.102Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 10, 2020 6:46:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-10T06:46:07.571Z: Workers have started successfully.
    Aug 10, 2020 6:46:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-10T06:46:07.607Z: Workers have started successfully.
    Aug 10, 2020 6:46:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-10T06:46:45.478Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 10, 2020 6:46:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-10T06:46:45.728Z: Cleaning up.
    Aug 10, 2020 6:46:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-10T06:46:45.830Z: Stopping worker pool...
    Aug 10, 2020 6:47:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-10T06:47:38.331Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 10, 2020 6:47:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-10T06:47:38.375Z: Worker pool stopped.
    Aug 10, 2020 6:47:45 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-09_23_45_15-4947355697951739492 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): c1b01708-d951-47d8-9968-d3cdd361753d and timestamp: 2020-08-10T06:47:45.335000000Z:
                     Metric:                    Value:
                   read_time                    18.526
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 10, 2020 6:47:45 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.034 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker,5,main]) completed. Took 2 mins 43.222 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 29s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/mfnmebif436b6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #852

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/852/display/redirect>

Changes:


------------------------------------------
[...truncated 292.89 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 10, 2020 12:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 10, 2020 12:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 10, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 10, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 10, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 10, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 10, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 10, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 10, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 10, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 10, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 10, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 10, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 10, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 10, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 10, 2020 12:45:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 10, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 10, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-V944yQwasQx-9RZ4-jcYRMAix0noa2sRAmq4Lm7HwoM.jar
    Aug 10, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-lIlIMf80-nYQIgp_NNTV3M3mrskyieQ3QydF1lkMAgg.jar
    Aug 10, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-V944yQwasQx-9RZ4-jcYRMAix0noa2sRAmq4Lm7HwoM.jar
    Aug 10, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests-NSKoTCns-0gl-wq5PtQPpG9QeZIDPxjP8PtZYTEs0AE.jar
    Aug 10, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests-LAhgpf0Vki-3JgPBmOOECQUDCtE_VFoSi4f1fmRpAy8.jar
    Aug 10, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4580094000394350763.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-T5Hmdm2gqQ3XEcY7N9n03NsfP9JL8w9gQp2STWB4t1g.jar
    Aug 10, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT-uW0kPWzzN53jUYCmHdGsMokzugXM3hKw5LSnYdpHf40.jar
    Aug 10, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT-82JwIZ-FOU-XsOyTG0xQM1kSAaSG7SLQw83ESTgIFAw.jar
    Aug 10, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-okyUiyIVBCNPWQFGzJ4YERKM3CHMibZI6PcUkABPhPg.jar
    Aug 10, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-GqxiI8TDT7MJgmymVKKXFZmi4V2xj11hMyhq5GzyP-w.jar
    Aug 10, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-aeB0MoxLLjA44PJkBtCcjEL1rFcSpkQ6RxZ2VOKqMiI.jar
    Aug 10, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests-7SqUkPri9yvnY2xWRaY75ml-TjvEHYAaz2ZqGxVU_OE.jar
    Aug 10, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-bLeZmPPtjITEOMKJa1gPGTdBm-NvmGfMIg_zE-pcD5o.jar
    Aug 10, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests-rHu6fzYV8zklkQ91Nk4rWktkmEEpRWfnHlFo39vOXQo.jar
    Aug 10, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT-YjW_78tnNTic1zc4IMFtGGeaN2wB6_TGUfbFlHHwvy8.jar
    Aug 10, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-AUC-S1nWRjkm_49HvxexYeVEow12TsHkqXZl-RPe2XU.jar
    Aug 10, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.24.0-SNAPSHOT-m_dca8YJVkNMZiaMnDVpgBysAp1HfDsHSyurOa6QkY8.jar
    Aug 10, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-GEol0qPF5jserlUZXZ3PidojKcHDhd1xPN8a7nd-0ro.jar
    Aug 10, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.24.0-SNAPSHOT-uwO02eiya7_1jXCNtksBhyK5vsE7HdsXkgkX2P_gmkE.jar
    Aug 10, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-RAo0soKhYJP8t_TMs_YlRDq9RGChu7Rs8bS9dIeK8rc.jar
    Aug 10, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests-d4ELW5GdnKgZPSLfLjZjdclLQNi1okZaqZ-9wWDQaxY.jar
    Aug 10, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-RraEELmWQUPaPpXNdeOumxsai3_Qi8Ku8c92wZU850w.jar
    Aug 10, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.24.0-SNAPSHOT-bP0M4O6LWJO2C8fG-Bht2mf2A344C042gHR2eKJCdzY.jar
    Aug 10, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded-FRXcD77HjIdfyaJXZm5YXGaX3fpYmPeaj5DBeEhQILY.jar
    Aug 10, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests-P8EMwQPE-SD_5K5cxEK-lmsb6MHsctsrCfVD9EMtpJU.jar
    Aug 10, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests-ID-XeXmQkYkjZG51Rqpd6whOt7_uYEnG7NzPN2Uw4JI.jar
    Aug 10, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-tests-BTzB6zHjgRAQjNLIqaJYROsteWu33p7fg7DbfdoM458.jar
    Aug 10, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-KZdqXRevXW-hkH1GyTtcsRo0Qxozo0ebMmdVSIEFhMA.jar
    Aug 10, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.24.0-SNAPSHOT-GqvJ1m36RTujLvlZhdKnJz6C2Mjud6Z9x148KFgxnpU.jar
    Aug 10, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.24.0-SNAPSHOT-_Do1BeAq7DS8d6NfR1U75QlXSQ-C6Ra7nR4oFEnXcus.jar
    Aug 10, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.24.0-SNAPSHOT-QQgfaXc7kcBdignSHvf5CODtI-25EYkpXYGGpyAMRGA.jar
    Aug 10, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 1 seconds
    Aug 10, 2020 12:45:19 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 10, 2020 12:45:19 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 10, 2020 12:45:19 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 10, 2020 12:45:19 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 10, 2020 12:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 10, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93834 bytes, hash 70f355c8252f03565569d80ad316dbee95fe972a4c919b44657e65a09e65812e> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-cPNVyCUvA1ZVadgK0xbb7pX-lypMkZtEZX5loJ5lgS4.pb
    Aug 10, 2020 12:45:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.24.0-SNAPSHOT
    Aug 10, 2020 12:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-09_17_45_20-10691281356357235777?project=apache-beam-testing
    Aug 10, 2020 12:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-09_17_45_20-10691281356357235777
    Aug 10, 2020 12:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-09_17_45_20-10691281356357235777
    Aug 10, 2020 12:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-10T00:45:20.350Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 10, 2020 12:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-10T00:45:28.520Z: Worker configuration: n1-standard-1 in us-central1-f.
    Aug 10, 2020 12:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-10T00:45:29.244Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 10, 2020 12:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-10T00:45:29.287Z: Expanding GroupByKey operations into optimizable parts.
    Aug 10, 2020 12:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-10T00:45:29.313Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 10, 2020 12:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-10T00:45:29.389Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 10, 2020 12:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-10T00:45:29.420Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 10, 2020 12:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-10T00:45:29.454Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 10, 2020 12:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-10T00:45:29.489Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 10, 2020 12:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-10T00:45:29.911Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 10, 2020 12:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-10T00:45:29.993Z: Starting 5 workers in us-central1-f...
    Aug 10, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-10T00:45:39.073Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 10, 2020 12:45:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-10T00:45:58.223Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 10, 2020 12:46:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-10T00:46:26.184Z: Workers have started successfully.
    Aug 10, 2020 12:46:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-10T00:46:26.223Z: Workers have started successfully.
    Aug 10, 2020 12:47:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-10T00:47:00.199Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 10, 2020 12:47:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-10T00:47:00.409Z: Cleaning up.
    Aug 10, 2020 12:47:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-10T00:47:00.501Z: Stopping worker pool...
    Aug 10, 2020 12:47:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-10T00:47:55.136Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 10, 2020 12:47:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-10T00:47:55.176Z: Worker pool stopped.
    Aug 10, 2020 12:48:03 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-09_17_45_20-10691281356357235777 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): d8cc3473-5ae8-43c1-a750-85dceff26e98 and timestamp: 2020-08-10T00:48:03.988000000Z:
                     Metric:                    Value:
                   read_time                    14.933
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 10, 2020 12:48:04 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.019 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':',5,main]) completed. Took 2 mins 57.195 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 49s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/gva4waendf74i

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #851

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/851/display/redirect>

Changes:


------------------------------------------
[...truncated 293.89 KB...]
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 09, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 09, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 09, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 09, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 09, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 09, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 09, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
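
This readUsingDefaultMethod failure is exactly the coder-inference problem the exception describes: the ParDo(RowMonitor) output is a PCollection of Row with no schema attached, so no RowCoder can be inferred for it. Below is a minimal, self-contained sketch of the remedy the message itself suggests, PCollection.setRowSchema, using illustrative field names rather than the IT's actual code:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        // Schema mirroring the projected columns of the query above (names illustrative).
        Schema schema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt32Field("score")
                .build();

        PCollection<Row> rows =
            p.apply(
                Create.of(Row.withSchema(schema).addValues("a", "story", "t", 3).build())
                    .withCoder(RowCoder.of(schema)));

        rows.apply(
                "RowMonitor",
                ParDo.of(
                    new DoFn<Row, Row>() {
                      @ProcessElement
                      public void process(@Element Row row, OutputReceiver<Row> out) {
                        out.output(row);
                      }
                    }))
            // Without this, coder inference fails exactly as in the stack trace above.
            .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }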

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 09, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 09, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 09, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 09, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 09, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 09, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 09, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 09, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
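
The two planner lines above are the point of the push-down test: only the four referenced columns are requested, and the WHERE clause is evaluated by BigQuery itself instead of inside the pipeline. Written directly against BigQueryIO (a hedged sketch, not the SQL extension's internals; the table reference is a placeholder), the same read looks roughly like this:

    import com.google.api.services.bigquery.model.TableRow;
    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.values.PCollection;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        // Placeholder table reference for the Hacker News data read by the IT.
        PCollection<TableRow> rows =
            p.apply(
                BigQueryIO.readTableRows()
                    .from("bigquery-public-data:hacker_news.full")
                    .withMethod(BigQueryIO.TypedRead.Method.DIRECT_READ)
                    // Projection push-down: only the columns the query uses.
                    .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                    // Filter push-down: the predicate from the log line above.
                    .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }
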
    Aug 09, 2020 6:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 09, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 09, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-QClyFENYfICu7uenVD253gy2ve9xZdSHpSOyVcTD3qM.jar
    Aug 09, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-cyPxCkJPte793X5AZhoyOQZvLNBvRW6Dibp5M9vSZ1s.jar
    Aug 09, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests-GVkVZOTSCDfkYtPgcAsuxaqrCRhtf6kEbto-65ApDMQ.jar
    Aug 09, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests-dxYZPplAurYhf2nmP0DGviqUjJ8sFO3O9kiFQRMDZAA.jar
    Aug 09, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests-4tU9eZqLh4VKM-aHQ1jvaexXAfd1SFc_CiEms94jQpc.jar
    Aug 09, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-f0Tol5SQXUFJBDxnZC11ILHQYdOoe6dD3LeB5nQB_CU.jar
    Aug 09, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test309840356717724091.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-X9PXu3tscNMA72KsUBGpKgndBLlE0u3cn_EAp0afz1o.jar
    Aug 09, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-TrBiJXLy2pBm6TyR6pIVd2TIUlueFpbRcWMavEo5Iyk.jar
    Aug 09, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT-NhFevEyUjUami0g1ZLxWlAwniBuKBxAL1geO5Zh5N6Q.jar
    Aug 09, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-QClyFENYfICu7uenVD253gy2ve9xZdSHpSOyVcTD3qM.jar
    Aug 09, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded-UT2l-qhdbogjE6J8xjBKmTmAg4Yt2QO_kDKsOD6iz38.jar
    Aug 09, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-gFSB9-1812xy0JmviBNaoLmZeejLxgQe4KLmhy4z-AQ.jar
    Aug 09, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-wqT9iixEBf2WfeW_gg1cY8QazfyRixvYjAQ-9NIEyVc.jar
    Aug 09, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests-HnT-K4tQZY3UoTIQBo9lKVdYz8tS7Bq7Xu1lGPfnuzA.jar
    Aug 09, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-SKy63gviTI3uzujAcyiDV1GXBkFUDzo2chLwKs6A5xE.jar
    Aug 09, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-8xDBPB4gFrOokjO2mn0W5Q4q0GLmekp5apnpAok6Sh0.jar
    Aug 09, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-twsKQLStuN9xh6ngBQe3wlPt55PmTAkWOuXWTkmS5e8.jar
    Aug 09, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-KGZyVZmEC4X8OX34NFRAegqOmx4Ep-OLGLgPGfEMBoc.jar
    Aug 09, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests-JlluJwq9mZW_72p2hXiiT9xGudyg9AQbft9mAr02EGU.jar
    Aug 09, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.24.0-SNAPSHOT-Azs8qbIvlXKgHBRJ1xaVbk9Z0pjhuAmnYVUuDJV2O4M.jar
    Aug 09, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests-DGmh1JxRizWNlZbx7kbY7L-C_diHrQhf38Kjg2Yw7CE.jar
    Aug 09, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.24.0-SNAPSHOT-766mvIyUSCd05CX0C7nJUgzoIbeproImN2NSp1xGilg.jar
    Aug 09, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-X4dSSqb1noYVCYuafCH2LkVOpTYGcdwYVTtV-UfG35A.jar
    Aug 09, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.24.0-SNAPSHOT-7YWg21xLxU54HQOz8JJhapvLGdSk6QTKnSmbnzoJJkk.jar
    Aug 09, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests-xxtMyF2fp1O9tMSchECsd2Ef1kXlKMF_Y2cmvGNhy4Y.jar
    Aug 09, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT-roNcG1J8knvwcY8vsJgDklW4o_ZsGdws3JFW9FeIFsw.jar
    Aug 09, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-tests-7eKogrC3uGfL0Z2wo_Tpz-g7Hsq99z0MM4etArd3DUk.jar
    Aug 09, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT-V53bVc7T3uukPU5K4Zg4XC12cpQ6iIpZnrNTInSqW-U.jar
    Aug 09, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.24.0-SNAPSHOT-Dw7oF7nq2OdNsDyQ2z0YabCYJDVs8piAH_Tei-qLO7I.jar
    Aug 09, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.24.0-SNAPSHOT-st-HIQBeEtwEWZjbKrAv71pGncjPOl1anvBVTL6Oa_I.jar
    Aug 09, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.24.0-SNAPSHOT-YbWdAP1g4A14mfQd8BhzJ0nRoL_nedG5Rr4LZCJzL5k.jar
    Aug 09, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 1 seconds
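
Everything from "Uploading 215 files from PipelineOptions.filesToStage" down to this line is the DataflowRunner staging the job's classpath in GCS before submission. A minimal sketch of the options that drive a submission like this (project, region, and bucket names are placeholders; the perf test itself targets apache-beam-testing and the temp-storage-for-perf-tests bucket):

    import org.apache.beam.runners.dataflow.DataflowRunner;
    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class DataflowSubmissionSketch {
      public static void main(String[] args) {
        DataflowPipelineOptions options =
            PipelineOptionsFactory.fromArgs(args).as(DataflowPipelineOptions.class);
        options.setRunner(DataflowRunner.class);
        // Placeholder values; real runs supply these on the command line.
        options.setProject("my-gcp-project");
        options.setRegion("us-central1");
        options.setTempLocation("gs://my-bucket/temp/");
        options.setStagingLocation("gs://my-bucket/staging/");

        Pipeline p = Pipeline.create(options);
        // Applying transforms is omitted here; p.run() performs the classpath
        // staging shown above and then submits the job to the Dataflow service.
        p.run().waitUntilFinish();
      }
    }
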
    Aug 09, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 09, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 09, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 09, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 09, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 09, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93833 bytes, hash 48c347cc8a74b51f0d29ff8436f24743f2d801319e9459f0211af9f3c2924092> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-SMNHzIp0tR8NKf-ENvJHQ_LYATGelFnwIRr588KSQJI.pb
    Aug 09, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.24.0-SNAPSHOT
    Aug 09, 2020 6:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-09_11_45_21-14918365405197296583?project=apache-beam-testing
    Aug 09, 2020 6:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-09_11_45_21-14918365405197296583
    Aug 09, 2020 6:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-09_11_45_21-14918365405197296583
    Aug 09, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-09T18:45:21.059Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 09, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T18:45:28.434Z: Worker configuration: n1-standard-1 in us-central1-a.
    Aug 09, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T18:45:29.188Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 09, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T18:45:29.231Z: Expanding GroupByKey operations into optimizable parts.
    Aug 09, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T18:45:29.267Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 09, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T18:45:29.342Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 09, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T18:45:29.371Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 09, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T18:45:29.406Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 09, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T18:45:29.432Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 09, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T18:45:29.888Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 09, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T18:45:29.968Z: Starting 5 workers in us-central1-a...
    Aug 09, 2020 6:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T18:45:56.144Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Aug 09, 2020 6:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T18:45:56.178Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Aug 09, 2020 6:45:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-09T18:45:58.769Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 09, 2020 6:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T18:46:01.545Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 09, 2020 6:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T18:46:17.922Z: Workers have started successfully.
    Aug 09, 2020 6:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T18:46:17.953Z: Workers have started successfully.
    Aug 09, 2020 6:46:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T18:46:48.340Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 09, 2020 6:46:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T18:46:48.538Z: Cleaning up.
    Aug 09, 2020 6:46:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T18:46:48.630Z: Stopping worker pool...
    Aug 09, 2020 6:47:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T18:47:39.156Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 09, 2020 6:47:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T18:47:39.203Z: Worker pool stopped.
    Aug 09, 2020 6:47:47 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-09_11_45_21-14918365405197296583 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): ff5ad40e-8fda-4bbb-8e62-db379626ed03 and timestamp: 2020-08-09T18:47:47.595000000Z:
                     Metric:                    Value:
                   read_time                    13.489
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 09, 2020 6:47:48 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 7,5,main]) completed. Took 2 mins 40.73 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 32s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/7s2ex7d3kxqbq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #850

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/850/display/redirect>

Changes:


------------------------------------------
[...truncated 296.89 KB...]
    Aug 09, 2020 12:45:24 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 09, 2020 12:45:24 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 09, 2020 12:45:24 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 09, 2020 12:45:24 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 09, 2020 12:45:24 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 09, 2020 12:45:24 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 09, 2020 12:45:24 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 09, 2020 12:45:24 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 09, 2020 12:45:24 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 09, 2020 12:45:24 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 09, 2020 12:45:24 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 09, 2020 12:45:24 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 09, 2020 12:45:24 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 09, 2020 12:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 09, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 09, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-XiWWYLL1Vrax0DACm2tOiU3-LIbyaLjGBQaMVKBZ4u0.jar
    Aug 09, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-QQc1m27GXfVer5VuSnH4aDfSSQ44JvPZq0uI4ios7C0.jar
    Aug 09, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests-lIZbIGneQh9gUtTB94D21JSQjgriy_d4rxLsxzrCUGg.jar
    Aug 09, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests-KqjMd-rnskvE0hHj3rOOAxDHqIvjdLB9_J7JfdXC8GE.jar
    Aug 09, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-aFejhVsvepYhlGPjcI_-rAdqtyWhodN593KkBYdX-gs.jar
    Aug 09, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.24.0-SNAPSHOT-zZRHITA9BW5Jl9_lt3pOYZDxh0nOaJ99Ahg--O7I6J0.jar
    Aug 09, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7584989297358832437.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-K9cnZRi2k5FzcqxEGt9cvi4xuE5SLurBzaVRKtHwC6E.jar
    Aug 09, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT-phjRQlB-K63X-hO3lNRJIxm_BhUuJdKXnhXmLdXKEho.jar
    Aug 09, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests-8RURS4_0cTU28z3-NGzYQm2e580ixss8C67W5ACAXP8.jar
    Aug 09, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tNdwE5AiF9Oq4OQhAmvwCf4Eb7oh2F9XoZ8G077G_qM.jar
    Aug 09, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-UWAVpU6gS8lO_haPLf55I4RXUixZ0a6VYtr81xGzk0g.jar
    Aug 09, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT-ZygxfX3oXs_YG0pTChT7WPXDFTV9ADi9koOG8-77dZg.jar
    Aug 09, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests-vIjMP0-SuUOtnx0H488y2zuJklo5Z5qCRj_T5KGxYQw.jar
    Aug 09, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-W1ItKAuwBdmmOMLALcim2ULcWVMB9mxTyn9bz-BwZIw.jar
    Aug 09, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.24.0-SNAPSHOT--CAXiiIFaq2P9f4mVaHi4pOWkDR3e2f8wsa06AROKjQ.jar
    Aug 09, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests-Pcd0UUO8J5iOwTab7Zzcko6xYO3is7rccD_2D9oBA_Q.jar
    Aug 09, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-BzTUF2siNjQ3Q_EHHPCNj5BWBriv69GyFtwqQirnDgw.jar
    Aug 09, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-l9pdW49Zl6pOWNn-RC5gKw_GzpBEKNt4IfWwbA8zKoo.jar
    Aug 09, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-NG5bQ5yQgRGZPFC9N8WalL4gP6-LyuF59uwkDC_dfRg.jar
    Aug 09, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT-JKxsN3b8vZcSxpEVJTPC_4pIvl37v8skHqT1vQJHLn8.jar
    Aug 09, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-XiWWYLL1Vrax0DACm2tOiU3-LIbyaLjGBQaMVKBZ4u0.jar
    Aug 09, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-ZhrpiSDYds_nujwLOcO_4N9dDiz-zkoKyeNzTLTzJhI.jar
    Aug 09, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests-7cHnjUkRnkYQ0-EUd3gjU1JwC-YK_ETi-oZul3MeZzE.jar
    Aug 09, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests-hd2trDM5nHzbSITie1GrRkTBR-6QIlT41WFDhWIWhCU.jar
    Aug 09, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.24.0-SNAPSHOT-Y2Avb7BmP9mJgytz5TuphChuIiPEiJIbXJM-r4FaDS8.jar
    Aug 09, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-tests-MYD4VSh5_oaZF26GgFbXkmq9eF2JpItRsXuRu6zX_vo.jar
    Aug 09, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-vntZBLhGbzBpwvMfb6hC1xp-2W78uKONI3Q1BC1LgDA.jar
    Aug 09, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded-oWtEBa_FsJ_k6QhEqslLimdo7red-7xzTaGlAP0FCn0.jar
    Aug 09, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.24.0-SNAPSHOT-Xe-88PwZ7bqZKEOCz9Qj3Ijx7qPS1HjY760AzSQ9cSs.jar
    Aug 09, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.24.0-SNAPSHOT-3zPNJtejq0u21vO0jJDYy89s15Z467iFIF0bWiPUZeo.jar
    Aug 09, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.24.0-SNAPSHOT-P01ACGYXsS2JLbFRz0NZgneGGpmv1bQdsHQHei2ReNs.jar
    Aug 09, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 1 seconds
    Aug 09, 2020 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 09, 2020 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 09, 2020 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 09, 2020 12:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 09, 2020 12:45:29 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 09, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93833 bytes, hash 2ed87d6c270ae0686d16b369a8cdc2bbb1cc8ced37eabb384abb748b5c9252b7> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Lth9bCcK4GhtFrNpqM3Cu7HMjO036rs4Srt0i1ySUrc.pb
    Aug 09, 2020 12:45:29 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.24.0-SNAPSHOT
    Aug 09, 2020 12:45:30 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-09_05_45_29-1867479021149968631?project=apache-beam-testing
    Aug 09, 2020 12:45:30 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-09_05_45_29-1867479021149968631
    Aug 09, 2020 12:45:30 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-09_05_45_29-1867479021149968631
    Aug 09, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-09T12:45:29.351Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 09, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T12:45:38.558Z: Worker configuration: n1-standard-1 in us-central1-f.
    Aug 09, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T12:45:39.282Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 09, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T12:45:39.366Z: Expanding GroupByKey operations into optimizable parts.
    Aug 09, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T12:45:39.410Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 09, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T12:45:39.529Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 09, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T12:45:39.570Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 09, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T12:45:39.620Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 09, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T12:45:39.659Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 09, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T12:45:40.347Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 09, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T12:45:40.445Z: Starting 5 workers in us-central1-f...
    Aug 09, 2020 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-09T12:45:52.732Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 09, 2020 12:47:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T12:46:58.133Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
    Aug 09, 2020 12:47:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T12:46:58.175Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
    Aug 09, 2020 12:47:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T12:47:08.908Z: Autoscaling: Raised the number of workers to 3 based on the rate of progress in the currently running stage(s).
    Aug 09, 2020 12:47:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T12:47:08.955Z: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
    Aug 09, 2020 12:47:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T12:47:16.307Z: Workers have started successfully.
    Aug 09, 2020 12:47:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T12:47:16.349Z: Workers have started successfully.
    Aug 09, 2020 12:47:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T12:47:46.679Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Aug 09, 2020 12:47:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T12:47:46.770Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Aug 09, 2020 12:47:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T12:47:52.462Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 09, 2020 12:47:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T12:47:52.873Z: Cleaning up.
    Aug 09, 2020 12:47:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T12:47:52.987Z: Stopping worker pool...
    Aug 09, 2020 12:51:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T12:51:07.014Z: Autoscaling: Resized worker pool from 4 to 0.
    Aug 09, 2020 12:51:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T12:51:07.114Z: Worker pool stopped.
    Aug 09, 2020 12:51:16 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-09_05_45_29-1867479021149968631 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 7e625843-1c90-4ce1-9b12-a99c4aa5b898 and timestamp: 2020-08-09T12:51:16.085000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    15.849

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 09, 2020 12:51:16 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.018 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 5,5,main]) completed. Took 6 mins 1.068 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 2s
106 actionable tasks: 73 executed, 33 from cache

Publishing build scan...
https://gradle.com/s/3gnkd7cypb24m

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #849

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/849/display/redirect>

Changes:


------------------------------------------
[...truncated 296.18 KB...]
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
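
The failure above is the same coder problem in each of these runs: the output of ParDo(RowMonitor) is a PCollection of Beam Rows with no schema attached, so no Coder can be inferred. A minimal, self-contained sketch of the remedy the message itself suggests, PCollection.setRowSchema (the DoFn and its input below are illustrative, not the IT's actual code; the field names mirror the query):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class SetRowSchemaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();
        Schema schema = Schema.builder()
            .addStringField("author")
            .addStringField("type")
            .addStringField("title")
            .addInt32Field("score")
            .build();

        PCollection<Row> rows =
            p.apply(Create.of("seed"))
                .apply("EmitRows", ParDo.of(new DoFn<String, Row>() {
                  @ProcessElement
                  public void process(ProcessContext c) {
                    c.output(Row.withSchema(schema)
                        .addValues("someone", "story", "a title", 3)
                        .build());
                  }
                }))
                // Without this (or an explicit setCoder), Beam cannot infer a
                // Coder for Row and fails exactly as in the stack trace above.
                .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }
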

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 09, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 09, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 09, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 09, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 09, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 09, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 09, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 09, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
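
The plan above projects only the used fields (by, type, title, score) and marks the whole WHERE clause as supported, so the filter just logged is handed to the BigQuery Storage read instead of being evaluated in the pipeline. At the BigQueryIO level that push-down corresponds roughly to the following hedged sketch (the table reference is illustrative, not the table this test reads):

    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();
        p.apply("Read Input BQ Rows with push-down",
            BigQueryIO.readTableRows()
                .from("bigquery-public-data:hacker_news.full")  // illustrative
                .withMethod(TypedRead.Method.DIRECT_READ)
                // Field projection: only the columns the query actually uses.
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                // Predicate push-down: the filter is evaluated server-side.
                .withRowRestriction(
                    "(`type` = 'story' OR `type` = 'job') AND `score` > 2"));
        p.run().waitUntilFinish();
      }
    }
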
    Aug 09, 2020 6:45:13 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 09, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 09, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-CHgpB0hPv3oZ2sYIcO7wsMSQ-CjNfrFJpWfGaIwTqpU.jar
    Aug 09, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3274536280364127321.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-lZ-pGPpkK3PyY5lLxVaE4k9pVWHF_hWBnOT6S-Rk5GM.jar
    Aug 09, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-aEtdqETDyE_CHctIU-NZIplXCkKwOv0B4o3Cif-MCZg.jar
    Aug 09, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests-CSL8sYgYiG-i_ItZv_3tKvjvu_vL4EuD6TUa-vwuARw.jar
    Aug 09, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT-_HVeX2EfnQwyvybOli_DHNThqu-wUvg-buIAXGhO3AU.jar
    Aug 09, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.24.0-SNAPSHOT-rophNXFy5j1K6EMFtumPHjqvY_kCT_q3CFY4kfI8Mus.jar
    Aug 09, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-vFjuRwqHxupVPIeq-pCFInaLwZz2d_rKmLMHPYXMCps.jar
    Aug 09, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-cCkudNCyfDS3ijZ_RTxJzqVijnV26jDZecUj7Mw5Ggs.jar
    Aug 09, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-CHgpB0hPv3oZ2sYIcO7wsMSQ-CjNfrFJpWfGaIwTqpU.jar
    Aug 09, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests-vC3cH2lqnZ6v7RSU9tLwIv-zLQIt_QcPE3XlkdyYKRM.jar
    Aug 09, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-56natv0RR7EjtEpZd1Pa87u35WSDGs8-V_TMCFSCkm4.jar
    Aug 09, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.24.0-SNAPSHOT-fK3yYYb8X8qItk0yVnFiJRF7jaWSpbSY_zW_AH5cKzI.jar
    Aug 09, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests-kfGUvi08Ev3I7KC3BfR6e7imvy5hSOxZAhJyX7ZB-n8.jar
    Aug 09, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests-ZvObagLVA_z_FVDksNy3V8QBQrnfqDYmknGFvvf00us.jar
    Aug 09, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-4CYXwR4g7DXEyJityML3kauzQWjTja6Dkxt_PXSRYY8.jar
    Aug 09, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded-VlxXek30RMiZzNIPleX1co-4z-3UCTUt0cSkZDNQdjs.jar
    Aug 09, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests-DW7W-EtC3bNnIoXHBxzBNirH81q-WEcFaXL-MBC-lP8.jar
    Aug 09, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT-Cwq2ef86GYYMoBRHcxS7ebRHUkI8h-G-BiA49AVRb7I.jar
    Aug 09, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests-k53ljAMnNtSzGrgeKTJH9SokGcP6NiX64BXkxJsXx5M.jar
    Aug 09, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-X95GDQhQYHRGnPlFfHsJuKoKDMEJPWY7PsgPryYlllY.jar
    Aug 09, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-MdS861W0K6MK7wsu3cJ1iSmaVF6mvK7lLTStq0jRVOw.jar
    Aug 09, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests-g61CP-miA8O3TTRBADl5662Wj0bGOG_ijf2F9hIm4w0.jar
    Aug 09, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.24.0-SNAPSHOT-F60XtSObHbNUL9JdoZ3B0BaF0twR79z1SHRBPI9y90M.jar
    Aug 09, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-vXkzpa4GxTnVmzTz-apqe3RyOL9HXMJMc14TzZWpNu8.jar
    Aug 09, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT-P2MfHWVvd5VGQ3U2R9GuqcLu5Pwa51mrLFMDSxsf6Is.jar
    Aug 09, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-hdTcwknsohNqd_95FF5U-lf3ghT4yW-UgfGewDPUE1U.jar
    Aug 09, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-TdxjsGmHowLWKxFtIswRP28qLS_r4FesaAa7YiBC1rI.jar
    Aug 09, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-tests-hvLENC24xkyaWXVEt2GCsMEQCSCyk7yol_5ZV5aTD44.jar
    Aug 09, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.24.0-SNAPSHOT-K1W9loFyzklRj0QVzd9WuNXgf2RAJroy6c6nPoUczZg.jar
    Aug 09, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/de.flapdoodle.embed/de.flapdoodle.embed.mongo/2.2.0/781d14f4e3d9eeb0b4c3e00a4ec165a04b3dc5c4/de.flapdoodle.embed.mongo-2.2.0.jar to gs://temp-storage-for-perf-tests/loadtests/staging/de.flapdoodle.embed.mongo-2.2.0-vNy3lJC0jW9u4Cy1AHsqSbjRUqOTX9ycpEmHkht7vvk.jar
    Aug 09, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.24.0-SNAPSHOT-9fYJ6mNwfDNrhQijf0s32gh_AoZLeZ-JVn9ZCz5oOYk.jar
    Aug 09, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.apache.beam/beam-vendor-calcite-1_20_0/0.1/6d16a59dc771784789116607a04acd9a0ef91d16/beam-vendor-calcite-1_20_0-0.1.jar to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-calcite-1_20_0-0.1-1NrX_9FNKiEqNk5qBOaRlj-IwqOvKvQIGIbTVgm_v8Y.jar
    Aug 09, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.codehaus.janino/janino/3.0.11/e699e368095379ba0402ea1780a87fcaea16e68f/janino-3.0.11.jar to gs://temp-storage-for-perf-tests/loadtests/staging/janino-3.0.11-kje3HSMpGA5ZIQ6aqhAO4xNFTvCuWIYIx1yxkxlZG-E.jar
    Aug 09, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/de.flapdoodle.embed/de.flapdoodle.embed.process/2.1.2/986b38302fa10018d5baccee7f655c31ee9afe5b/de.flapdoodle.embed.process-2.1.2.jar to gs://temp-storage-for-perf-tests/loadtests/staging/de.flapdoodle.embed.process-2.1.2-OasY7D5KRAimcZcWcjFwgi8Qb4B-iff1FfrVeNSih6A.jar
    Aug 09, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.codehaus.janino/commons-compiler/3.0.11/f2a6ec7fbc929c9fc87ff8bb486c0574951c5b09/commons-compiler-3.0.11.jar to gs://temp-storage-for-perf-tests/loadtests/staging/commons-compiler-3.0.11-DxpPXyZccBoxkzJErnBF_O8YtPpZUEF-Je5wvlDd2s8.jar
    Aug 09, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.apache.commons/commons-csv/1.8/37ca9a9aa2d4be2599e55506a6d3170dd7a3df4/commons-csv-1.8.jar to gs://temp-storage-for-perf-tests/loadtests/staging/commons-csv-1.8-qL1WZS7UZo2dWjOZSuUvWbnjnI6w68tmhOaK7udXmmE.jar
    Aug 09, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.24.0-SNAPSHOT-dAVakVBIqLzwlPuFOOY4-oZcdudrMAy16YOIyCURyTs.jar
    Aug 09, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.antlr/antlr4/4.7/cd6df46532bccabd8127c18c9ca5ef481962e931/antlr4-4.7.jar to gs://temp-storage-for-perf-tests/loadtests/staging/antlr4-4.7-eGclcCizNzrwEd7nts6bWHqP1cegsl9osv9MuQvoqgc.jar
    Aug 09, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.antlr/antlr4-runtime/4.7/30b13b7efc55b7feea667691509cf59902375001/antlr4-runtime-4.7.jar to gs://temp-storage-for-perf-tests/loadtests/staging/antlr4-runtime-4.7-KmGUP4A7vR0OAt_9GbkqQY-DNAyZQ0aAnjtR4iMapsA.jar
    Aug 09, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.mongodb/mongo-java-driver/3.9.1/d313237180bf9f2f82e12f503d9617e6b070f792/mongo-java-driver-3.9.1.jar to gs://temp-storage-for-perf-tests/loadtests/staging/mongo-java-driver-3.9.1-mxKxkvmYluxV-Hdn57uyt-MjjSQUsFjxFw9tjhx0bm4.jar
    Aug 09, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.antlr/ST4/4.0.8/a1c55e974f8a94d78e2348fa6ff63f4fa1fae64/ST4-4.0.8.jar to gs://temp-storage-for-perf-tests/loadtests/staging/ST4-4.0.8-WMqrxAyfdLC1mT_YaOD2SlDAdZCU5qJRqq-tmO38ejs.jar
    Aug 09, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.antlr/antlr-runtime/3.5.2/cd9cd41361c155f3af0f653009dcecb08d8b4afd/antlr-runtime-3.5.2.jar to gs://temp-storage-for-perf-tests/loadtests/staging/antlr-runtime-3.5.2-zj_I7LEPOemjzdy7LONQ0nLZzT0LHhjm_nPDuTichzQ.jar
    Aug 09, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.abego.treelayout/org.abego.treelayout.core/1.0.3/457216e8e6578099ae63667bb1e4439235892028/org.abego.treelayout.core-1.0.3.jar to gs://temp-storage-for-perf-tests/loadtests/staging/org.abego.treelayout.core-1.0.3--l4xOVw5wufUasoPgfcgYJMWB7L6Qb02A46yy2-5MyY.jar
    Aug 09, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/com.ibm.icu/icu4j/58.2/db9fd4b4c189cf1518db14c67d14a2cfcfbe59f6/icu4j-58.2.jar to gs://temp-storage-for-perf-tests/loadtests/staging/icu4j-58.2-lT4eg7K-fD6i-I2obBNhT0fp5x01eMhSHX8Yd1a2OWI.jar
    Aug 09, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/net.java.dev.jna/jna/4.0.0/9b3a11c613ec3fd3440af4103b12c3de82d38b6e/jna-4.0.0.jar to gs://temp-storage-for-perf-tests/loadtests/staging/jna-4.0.0-2sJwtkQc4k2TqW3bbo-T2N8JkZJzh5mm9vz8KyQWyhk.jar
    Aug 09, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.glassfish/javax.json/1.0.4/3178f73569fd7a1e5ffc464e680f7a8cc784b85a/javax.json-1.0.4.jar to gs://temp-storage-for-perf-tests/loadtests/staging/javax.json-1.0.4-Dh3sQKHt6WWUElHtqWiu7gUsxPUDeLwxbMSOgVm9vrQ.jar
    Aug 09, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/net.java.dev.jna/jna-platform/4.0.0/deb6bf66918989b50209b8c9aaf3b2561af7f011/jna-platform-4.0.0.jar to gs://temp-storage-for-perf-tests/loadtests/staging/jna-platform-4.0.0-B21i7Yfna9yzdQ_-gKpJXuIRK9chJmOVTWVH9IJEkuk.jar
    Aug 09, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 169 files cached, 46 files newly uploaded in 2 seconds
    Aug 09, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 09, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 09, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 09, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 09, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 09, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93833 bytes, hash eb96f5c545259636607ff04ee80ee7773db8c443caa5e6148ad2a54db7271ae3> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-65b1xUUlljZgf_BO6A7ndz24xEPKpeYUitKlTbcnGuM.pb
    Aug 09, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.24.0-SNAPSHOT
    Aug 09, 2020 6:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-08_23_45_18-10840378742320295701?project=apache-beam-testing
    Aug 09, 2020 6:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-08_23_45_18-10840378742320295701
    Aug 09, 2020 6:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-08_23_45_18-10840378742320295701
    Aug 09, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-09T06:45:18.217Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 09, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T06:45:25.949Z: Worker configuration: n1-standard-1 in us-central1-f.
    Aug 09, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T06:45:26.680Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 09, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T06:45:26.722Z: Expanding GroupByKey operations into optimizable parts.
    Aug 09, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T06:45:26.752Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 09, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T06:45:26.830Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 09, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T06:45:26.868Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 09, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T06:45:26.905Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 09, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T06:45:26.939Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 09, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T06:45:27.521Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 09, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T06:45:27.656Z: Starting 5 workers in us-central1-f...
    Aug 09, 2020 6:45:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-09T06:45:54.988Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 09, 2020 6:46:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T06:45:58.983Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 09, 2020 6:46:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T06:46:22.046Z: Workers have started successfully.
    Aug 09, 2020 6:46:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T06:46:22.079Z: Workers have started successfully.
    Aug 09, 2020 6:46:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T06:46:54.682Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 09, 2020 6:46:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T06:46:54.890Z: Cleaning up.
    Aug 09, 2020 6:46:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T06:46:54.982Z: Stopping worker pool...
    Aug 09, 2020 6:47:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T06:47:45.445Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 09, 2020 6:47:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T06:47:45.494Z: Worker pool stopped.
    Aug 09, 2020 6:47:50 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-08_23_45_18-10840378742320295701 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 5d4b1594-c59b-4361-be03-0731a179bf9e and timestamp: 2020-08-09T06:47:50.735000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    12.717

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 09, 2020 6:47:51 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.033 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker,5,main]) completed. Took 2 mins 47.142 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 34s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/a5xk6h3cjrl7y

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #848

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/848/display/redirect?page=changes>

Changes:

[ettarapp] removed duplicate test

[ettarapp] fixed typo in comments

[ettarapp] removed duplicate tests

[relax] OrderedListState API


------------------------------------------
[...truncated 298.60 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 09, 2020 12:45:31 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 09, 2020 12:45:31 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 09, 2020 12:45:31 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 09, 2020 12:45:31 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 09, 2020 12:45:31 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 09, 2020 12:45:31 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 09, 2020 12:45:31 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 09, 2020 12:45:31 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 09, 2020 12:45:31 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 09, 2020 12:45:31 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 09, 2020 12:45:31 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 09, 2020 12:45:31 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 09, 2020 12:45:31 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 09, 2020 12:45:31 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 09, 2020 12:45:31 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 09, 2020 12:45:32 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 09, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 09, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-_VTLI2bixD4ojwbFb-W7Gkort2YlguSdcqdPSeWJVn8.jar
    Aug 09, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.24.0-SNAPSHOT-K4evDeyScdmy_3QlQhmZtkgJ_CNpKSSZZ1PGYt7H3ZA.jar
    Aug 09, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-xLvr_zDxPjr34n8ipvE57CkRhg5FAOQpyYPclbJJlA0.jar
    Aug 09, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-tests-tJiHLDLYuttEkTq397JqOZZSDKWsaCOkCnreGhhrdYc.jar
    Aug 09, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests--HAiJNe9M9EUUm-SOp0lbG3ICLeVJCjmainyUaiw2ks.jar
    Aug 09, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-d0l1fX_e2etaVeIXM-OuU9FJ3HNe4Rm8HEzP9R7-6CY.jar
    Aug 09, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT-soHif6wUoUvzTKB7vycxsousCDFf5oUGo2W_gZCuYH8.jar
    Aug 09, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-_VTLI2bixD4ojwbFb-W7Gkort2YlguSdcqdPSeWJVn8.jar
    Aug 09, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests-CHPxhpQgcWXLJhZeVMgR-abURH-8ShLRXJo7iiAWkX0.jar
    Aug 09, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.24.0-SNAPSHOT-2HZIrE6YAskzNl4tkWR6SRQte5zRifLmWMobOwxXsWM.jar
    Aug 09, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests-ngDmf1Bk3l8uKb1j8U_6N28vrrKLuHUKOfUxjjfmN6g.jar
    Aug 09, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests-ws58Ph4o9xHPqytXtOEuSi4NV10X0i65xYencLb-5FE.jar
    Aug 09, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-YAu7neAVMvUXsM63lc6MAGmUAxsI-pDUAESjOfWG67M.jar
    Aug 09, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.24.0-SNAPSHOT-PcOKC-kc7Kc3qHWwXcNSD_B348zM04_m5c3k5i27yh0.jar
    Aug 09, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded-C3JJvduYVkGWGS1-1IPOJhGUhdbGlkjiCP1hgDAaA48.jar
    Aug 09, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-TZrFT5V06TsEE7o4Jb3THf_Gr_WTHtwYLbW7ubT-2XM.jar
    Aug 09, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT-yYagiDt_uNNYPxT5gIR5OjNfPqGJM9WTl9yBg4oJEs4.jar
    Aug 09, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-iqNap7ahJ6vycAzQsQetZQhDUPbUyg_-MKKq7tkGIo0.jar
    Aug 09, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-9RYQTSD9AWMRZ9-60s-ns2hKk4iD_eIAnlVmZ-AzvIc.jar
    Aug 09, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests-MM84JA9fNuDzhyJJu75slO59AuPkPamHqz4EZmMskms.jar
    Aug 09, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-w9PIDex5v4xltmDaTz4GDr8HR3jYR_3D3OTFXyKRvJI.jar
    Aug 09, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-A1AunQeEI-wlqiKkwuAyu-sifvFwKpc7cP-aH19QFKk.jar
    Aug 09, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests-olsL17KzazEI3upazyrbo_5PFPqzA6YT7HN1rPJH1oQ.jar
    Aug 09, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7786117281054606063.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-sHFXXwY77AMjydhrBWjCjnHn2HRR4O3wEDyzYMLnRfA.jar
    Aug 09, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests-htu1mqT2-yh_SW94WZ6izTB4SjbePlBLG_SVn9Wv-hU.jar
    Aug 09, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-XMP0Hg7AEQBa8Rp_UxRO7MJAbllrHXhCOTM3-zNY7ow.jar
    Aug 09, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-YjYcExOum96Jk1dCnCwG22b89_JND_i7vWwm0SbZJF8.jar
    Aug 09, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT-AfMzdbtMuaXKokAo3v6G7IBRYozZvpSQE54rIFfA8NI.jar
    Aug 09, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.24.0-SNAPSHOT-CtXh2_gRrYbIOb6ApobYZzSXMTsZzMrRLNO3m9vEDNE.jar
    Aug 09, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.24.0-SNAPSHOT-xm_wuXZbTdy9E-fptoYF8GKxj_thTCtHha31IOwln6c.jar
    Aug 09, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.24.0-SNAPSHOT-av5rJRjefb6OXBa92NY2yaT-XK78LeN-xwddRFwlLj8.jar
    Aug 09, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 1 seconds
    Aug 09, 2020 12:45:35 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 09, 2020 12:45:35 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 09, 2020 12:45:35 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 09, 2020 12:45:35 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 09, 2020 12:45:35 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 09, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93834 bytes, hash 4efe94bfbbf2aa93c3db7034946fb22e45f41eaa10497f7e7db8081883375e36> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Tv6Uv7vyqpPD23A0lG-yLkX0HqoQSX9-fbgIGIM3XjY.pb
    Aug 09, 2020 12:45:35 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.24.0-SNAPSHOT
    Aug 09, 2020 12:45:37 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-08_17_45_36-12561415528110746066?project=apache-beam-testing
    Aug 09, 2020 12:45:37 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-08_17_45_36-12561415528110746066
    Aug 09, 2020 12:45:37 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-08_17_45_36-12561415528110746066
    Aug 09, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-09T00:45:36.076Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 09, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T00:45:44.410Z: Worker configuration: n1-standard-1 in us-central1-f.
    Aug 09, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T00:45:45.111Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 09, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T00:45:45.142Z: Expanding GroupByKey operations into optimizable parts.
    Aug 09, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T00:45:45.180Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 09, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T00:45:45.362Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 09, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T00:45:45.390Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 09, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T00:45:45.423Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 09, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T00:45:45.454Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 09, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T00:45:45.897Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 09, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T00:45:45.963Z: Starting 5 workers in us-central1-f...
    Aug 09, 2020 12:46:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-09T00:46:02.910Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 09, 2020 12:46:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T00:46:09.254Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 09, 2020 12:46:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T00:46:28.974Z: Workers have started successfully.
    Aug 09, 2020 12:46:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T00:46:29.007Z: Workers have started successfully.
    Aug 09, 2020 12:47:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T00:47:06.270Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 09, 2020 12:47:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T00:47:06.464Z: Cleaning up.
    Aug 09, 2020 12:47:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T00:47:06.547Z: Stopping worker pool...
    Aug 09, 2020 12:47:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T00:47:51.058Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 09, 2020 12:47:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-09T00:47:51.097Z: Worker pool stopped.
    Aug 09, 2020 12:47:56 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-08_17_45_36-12561415528110746066 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 6ccf2cb4-7461-44f6-ab00-56f8f681049e and timestamp: 2020-08-09T00:47:56.191000000Z:
                     Metric:                    Value:
                   read_time                    20.006
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 09, 2020 12:47:56 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 3 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':',5,main]) completed. Took 2 mins 33.534 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 41s
106 actionable tasks: 76 executed, 30 from cache

Publishing build scan...
https://gradle.com/s/nsm3kmtnte2fw

Stopped 2 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #847

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/847/display/redirect>

Changes:


------------------------------------------
[...truncated 294.77 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 08, 2020 6:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 08, 2020 6:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 08, 2020 6:45:11 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 08, 2020 6:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 08, 2020 6:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 08, 2020 6:45:11 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 08, 2020 6:45:12 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
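
The failure above points at its own remedy: the output of ParDo(RowMonitor) is a PCollection of Beam Rows, and such a PCollection needs either an explicit schema (PCollection.setRowSchema) or an explicit coder (setCoder(RowCoder.of(schema))) before the pipeline can finish specifying. What follows is a minimal, hypothetical sketch of that pattern, not the actual code of BigQueryIOPushDownIT; the schema and DoFn are stand-ins for the four columns the test query selects.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        // Stand-in schema for the four columns the test query selects.
        final Schema schema =
            Schema.builder()
                .addNullableField("author", Schema.FieldType.STRING)
                .addNullableField("type", Schema.FieldType.STRING)
                .addNullableField("title", Schema.FieldType.STRING)
                .addNullableField("score", Schema.FieldType.INT64)
                .build();

        Pipeline p = Pipeline.create();

        PCollection<Row> rows =
            p.apply(Create.of("hello", "world"))
                .apply(
                    "ToRow",
                    ParDo.of(
                        new DoFn<String, Row>() {
                          @ProcessElement
                          public void process(@Element String title, OutputReceiver<Row> out) {
                            out.output(
                                Row.withSchema(schema)
                                    .addValues("someone", "story", title, 3L)
                                    .build());
                          }
                        }))
                // Without this call, pipeline construction fails with the same
                // "Unable to return a default Coder ... Cannot provide a coder for a Beam Row"
                // error seen above; setCoder(RowCoder.of(schema)) would also work.
                .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }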

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 08, 2020 6:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 08, 2020 6:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 08, 2020 6:45:12 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 08, 2020 6:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 08, 2020 6:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 08, 2020 6:45:12 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 08, 2020 6:45:12 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 08, 2020 6:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
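
For context, the push-down logged above (usedFields plus a supported BigQueryFilter) corresponds, at the BigQueryIO level, to a Storage Read API read with selected fields and a row restriction. A rough, illustrative sketch follows; the table reference and the TableRow-based read are assumptions for illustration, not the values or code this test actually uses.

    import java.util.Arrays;
    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.values.PCollection;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // DIRECT_READ uses the BigQuery Storage Read API, which is what allows the
        // projection (usedFields) and the filter to be evaluated by BigQuery itself.
        PCollection<TableRow> rows =
            p.apply(
                "Read HACKER_NEWS with push-down",
                BigQueryIO.readTableRows()
                    .from("bigquery-public-data:hacker_news.full") // assumed table, for illustration
                    .withMethod(Method.DIRECT_READ)
                    .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                    .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }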
    Aug 08, 2020 6:45:13 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 08, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 08, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-OZk78AThzma1GEhfUTJVD7mKL8Jj3oJJghRqGiIWB0g.jar
    Aug 08, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests-TaFC9mBE8UaRixvSBkQAmLc3XemqDtOL86Ze3--6Bss.jar
    Aug 08, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-tests-VE7xfpYH92PtwbyPufBYszxjg5tlVdPgWMthQN6n6TU.jar
    Aug 08, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.24.0-SNAPSHOT-Ug1A7go8qvYl9n03UOgu20NAXqakbCZt7ZkEyIGBAys.jar
    Aug 08, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-0qqrXTCORVs_Q5xJS1vwoJY2OJ3I4Du0Ha9kaG-DbtM.jar
    Aug 08, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-EHIBQv597YqHia4aoc8Z-Fx44pjpC-Gazws6J7TGcr0.jar
    Aug 08, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.24.0-SNAPSHOT-hcp9BJX5GfmQwiQ5G8Gx6MRy-q2wrLk6bpmlUbkVZpg.jar
    Aug 08, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test496801314305751995.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-p8f8OF84nsSLeVhlomLwjaiFS38lE4OAO4aXHZkrjfQ.jar
    Aug 08, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests-aTa6MGQoOp3iFmNj8QsRTeEunqTXDI6cHg6Xk5Lnnhs.jar
    Aug 08, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-ZBp1muGxtPNqQYSjhpuAo6YIVNvDCyq7EQ7dAXEF-EM.jar
    Aug 08, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests-XzJryz19DGpU0JSMNNvVR9wCKbqKUTP0Qc-Ome3ctos.jar
    Aug 08, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-AI5Xr1vtBUSbsOpevoxqTkMj53PgNyqD_li4IqVBswY.jar
    Aug 08, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-zlodMFTkV1To9poTLd3-gxd0cTV67IJ8Myo0JmPAe7M.jar
    Aug 08, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-jEh0XGf--w4XF_VYvnMedhVwMpuhV9GKEJ6GV8_Q9bw.jar
    Aug 08, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-MJYUx7MQZpCNXxdPiEq5adClKCWerQe7z5zdfhe0sY0.jar
    Aug 08, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT-dr9BsNC4esDjVHo11zj9O7mYSbQmbw3AtjBllH2uVvw.jar
    Aug 08, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT-6LmBUVpwMob6_l3WR-uzCaCODlDxfZCoiBjo0m_bHSY.jar
    Aug 08, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-OZk78AThzma1GEhfUTJVD7mKL8Jj3oJJghRqGiIWB0g.jar
    Aug 08, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-ksM6cEw7JwXSj40DfNHWHz8I0ZcCyoWX3XTm3T50glU.jar
    Aug 08, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests-rVEcJNDJ9irV4XLTlsVUEfNQtAysfweFhHSJxmfKM0Q.jar
    Aug 08, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests-0AYpn8fMCqM-hN_hKcQSRDocvmWeJTrMbwpf_kbrNjs.jar
    Aug 08, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.24.0-SNAPSHOT-14rc5lZsGbP89k_ttP2nv_PsUHE5RUn7FUEcZFakSrM.jar
    Aug 08, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests-n1T76FFkeXOKNmOdL2yrrFyX71gUvyIcxZt678M31BE.jar
    Aug 08, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded-FS9ykWSlu8BaBX8HprIWSsxgMcICl9_52tEjaD9sTaU.jar
    Aug 08, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests-jBPcLiZoWCi_jcbRUnrRzPsfcC337EkxbPc76hVRvfg.jar
    Aug 08, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT--y7aUKr8AETceZ8DvG0bnxn3fYc8kfyK5dma5bPYjek.jar
    Aug 08, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-K2hxRYHvApuuvd2YUmLdgPoW6NU8HqQ1xjQS81DVgok.jar
    Aug 08, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-T0HROcmefrcQ5VxQw1adyprV68QaAl4naZJBp8i5gzk.jar
    Aug 08, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.24.0-SNAPSHOT-rctvQH_rSJuYzqdYngV4M-vd-jmwvVcj0-ldRXqXLlw.jar
    Aug 08, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.24.0-SNAPSHOT-BdYSZKcoAeRRn33OJp4-ThV-42iRH2cxihbJqCkd1zU.jar
    Aug 08, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.24.0-SNAPSHOT-A-0MAAzETFW0eGYLuVRAwfyeYou_5z2naaVssddOFkI.jar
    Aug 08, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 1 seconds
    Aug 08, 2020 6:45:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 08, 2020 6:45:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 08, 2020 6:45:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 08, 2020 6:45:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 08, 2020 6:45:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 08, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93833 bytes, hash 5e847f841e0cf1b5e031a42d7c3769916237a47785d37b627d435fc8b59dc66e> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-XoR_hB4M8bXgMaQtfDdpkWI3pHeF03tifUNfyLWdxm4.pb
    Aug 08, 2020 6:45:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.24.0-SNAPSHOT
    Aug 08, 2020 6:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-08_11_45_16-1528062914617256273?project=apache-beam-testing
    Aug 08, 2020 6:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-08_11_45_16-1528062914617256273
    Aug 08, 2020 6:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-08_11_45_16-1528062914617256273
    Aug 08, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-08T18:45:16.883Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 08, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-08T18:45:25.102Z: Worker configuration: n1-standard-1 in us-central1-a.
    Aug 08, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-08T18:45:25.831Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 08, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-08T18:45:25.877Z: Expanding GroupByKey operations into optimizable parts.
    Aug 08, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-08T18:45:25.907Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 08, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-08T18:45:25.975Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 08, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-08T18:45:26.013Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 08, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-08T18:45:26.045Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 08, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-08T18:45:26.076Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 08, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-08T18:45:26.453Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 08, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-08T18:45:26.545Z: Starting 5 workers in us-central1-a...
    Aug 08, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-08T18:45:43.454Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 08, 2020 6:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-08T18:45:54.039Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 08, 2020 6:46:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-08T18:46:13.523Z: Workers have started successfully.
    Aug 08, 2020 6:46:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-08T18:46:13.558Z: Workers have started successfully.
    Aug 08, 2020 6:46:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-08T18:46:44.443Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 08, 2020 6:46:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-08T18:46:44.616Z: Cleaning up.
    Aug 08, 2020 6:46:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-08T18:46:44.701Z: Stopping worker pool...
    Aug 08, 2020 6:47:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-08T18:47:39.343Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 08, 2020 6:47:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-08T18:47:39.386Z: Worker pool stopped.
    Aug 08, 2020 6:47:46 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-08_11_45_16-1528062914617256273 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 4dc000fd-935f-4adb-adcf-085949646f01 and timestamp: 2020-08-08T18:47:46.730000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    11.851

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 08, 2020 6:47:47 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.018 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.032 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 2 mins 43.228 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 31s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/x4fie7writ7ay

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #846

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/846/display/redirect>

Changes:


------------------------------------------
[...truncated 292.42 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 08, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 08, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 08, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 08, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 08, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 08, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 08, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 08, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 08, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 08, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 08, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 08, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 08, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 08, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 08, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 08, 2020 12:45:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 08, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 08, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-vIpu4IqpE_xinqTDmINkEWkLxwFXP0RkBWF_rapP2yk.jar
    Aug 08, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT--kTGk9Es35S3gcAggrmkvFfWraudhJrz9MLBZcgoCGs.jar
    Aug 08, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.24.0-SNAPSHOT-JcpDFIUur3PQNM1yulF5PsfpEYHqc2SWw85DqExwz70.jar
    Aug 08, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT-CvefFOhiAoVwRKIUXiLTmO9hazgQc8sk62rcZiazRig.jar
    Aug 08, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests-9BsPkDW1_mdUHAF_Vk0H-kxBjVV4rePKufchuwCINI0.jar
    Aug 08, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-O9LO1VkyoGnXsJXqr5wI_LYbjfcU73yRbHdtaq0bMeU.jar
    Aug 08, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-P01j0ufEnvGnMuBVKvUxKEqlFigrU7f3Ue5cbRb4qyY.jar
    Aug 08, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-vIpu4IqpE_xinqTDmINkEWkLxwFXP0RkBWF_rapP2yk.jar
    Aug 08, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT-53VvZWX-8Pee_hYj58FectTF3S3ZAqT4UesABftyd7A.jar
    Aug 08, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests-xsJKFOU_MxLIe53UmbMGGbmGKbiKTNFvec25OtQFT8c.jar
    Aug 08, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8162504313554877800.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-SP18ZrTDY2g9NGaG6MCKEJoOSMCdOBbFNuFXRyaYrF8.jar
    Aug 08, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests-YSlpIplK_e9JXOGti16hYA4ACj0WDjNzog7LCTWbe8Q.jar
    Aug 08, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.24.0-SNAPSHOT-05TcL5klkuErENLDmOpLCtLmb3JM1G2f8Krj67MSFeY.jar
    Aug 08, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-8t3jyHhFsKJLVaGDuYqkxlvc5ijUgIbBURMm-BvZT0I.jar
    Aug 08, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-Egw29d1mIlSSXpRQmMHAEU4-Kis9_CYKz_09rj9J9Cs.jar
    Aug 08, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests-FOb1gumjBGz6v9jIiSYJphUgMiCmXKWWtU4s8Bx7Ess.jar
    Aug 08, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests-ux-WtDbt_6nIyyP4J6ScEUAcHcHatwxpTovxB1StCqE.jar
    Aug 08, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT-E5WdiXwxYZEUmt1UXyxRZmijneekTVWPkc22YFQ6HCE.jar
    Aug 08, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-9PBPQeyP5dxKsDWvZRxw-h-ULvaR93gGClVHz0QzIyg.jar
    Aug 08, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.24.0-SNAPSHOT-Aia4Y1AcFa4-oLov7WFb-WYM6E0O-rHtFvHGJdi2f8I.jar
    Aug 08, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests-R532OX0Nmsc-IrkjyBO7x9AJ5mlDp_8Pp_PFG2STRfs.jar
    Aug 08, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests-m8Xu-JrMVxka2dFP85tNasYsh7o-a2JpnU7X_LaeuiU.jar
    Aug 08, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-7Kewv2AeNw9qXRfzWl264fiSW1H42BVXHkjY9q72EBE.jar
    Aug 08, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-7Z5M_JQHMLvqQMVMfnjTbm4489aOfISPjkMLankBnNo.jar
    Aug 08, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-tests-ceA4lqrRzgXEi5Y3NYHfp6H2nP6YNAJQPiyQ8xGdepE.jar
    Aug 08, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-4fvZQWn66N5dT9JxnkhQtd2A0qt0R13tEBG2ju5wv0I.jar
    Aug 08, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-S23CrbapJpJXYifzwYoWD8oVU_hcpfVMPQlLnkRlee4.jar
    Aug 08, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded-umZbxAnayyeDsdUpUgufP6_FJW_erPp4aBrLudErVbk.jar
    Aug 08, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.24.0-SNAPSHOT-Rvi723RcbwXn61Bw2XCWYqERgCGRCEBCiYda_s3vlt8.jar
    Aug 08, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.24.0-SNAPSHOT-bszL3l3rwcxYlWVHFtp2UhN-3b4Oxg0ZJWbD5QoE7v0.jar
    Aug 08, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.24.0-SNAPSHOT-DasNSYlNS0xd2BTkS8s-RDyMFE6gnOSlAND6AhZcWkg.jar
    Aug 08, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 1 seconds
    Aug 08, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 08, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 08, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 08, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 08, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 08, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93834 bytes, hash b0df8522b5bb590ebea61372b748895c42f0eb9b30d0fc00b73e93dd3da07439> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-sN-FIrW7WQ6-phNyt0iJXELw65sw0PwAtz6T3T2gdDk.pb
    Aug 08, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.24.0-SNAPSHOT
    Aug 08, 2020 12:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-08_05_45_19-6256127474837514803?project=apache-beam-testing
    Aug 08, 2020 12:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-08_05_45_19-6256127474837514803
    Aug 08, 2020 12:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-08_05_45_19-6256127474837514803
    Aug 08, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-08T12:45:19.656Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
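
Editor's note: the warning above is about the interaction of maxNumWorkers with a disabled autoscaling algorithm. The following is a minimal sketch of the relevant Dataflow worker-pool options, assuming the standard DataflowPipelineOptions API; the worker counts shown are illustrative and not taken from this job's configuration.

    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions.AutoscalingAlgorithmType;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class WorkerPoolOptionsSketch {
      public static void main(String[] args) {
        DataflowPipelineOptions options =
            PipelineOptionsFactory.fromArgs(args).withValidation().as(DataflowPipelineOptions.class);

        // With autoscaling disabled the worker count is fixed by numWorkers,
        // so any maxNumWorkers value is ignored; that is what the WARNING
        // above reports for this job.
        options.setAutoscalingAlgorithm(AutoscalingAlgorithmType.NONE);
        options.setNumWorkers(5);

        // To let the service scale out up to a ceiling instead, re-enable autoscaling:
        // options.setAutoscalingAlgorithm(AutoscalingAlgorithmType.THROUGHPUT_BASED);
        // options.setMaxNumWorkers(5);
      }
    }
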
    Aug 08, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-08T12:45:25.938Z: Worker configuration: n1-standard-1 in us-central1-f.
    Aug 08, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-08T12:45:27.274Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 08, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-08T12:45:27.320Z: Expanding GroupByKey operations into optimizable parts.
    Aug 08, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-08T12:45:27.342Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 08, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-08T12:45:27.438Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 08, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-08T12:45:27.472Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 08, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-08T12:45:27.507Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 08, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-08T12:45:27.541Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 08, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-08T12:45:27.876Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 08, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-08T12:45:27.953Z: Starting 5 workers in us-central1-f...
    Aug 08, 2020 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-08T12:45:53.797Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 08, 2020 12:46:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-08T12:46:01.219Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
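
Editor's note: the warning above points at the Monitoring API's metricDescriptors.list and metricDescriptors.delete methods for cleaning up old descriptors. Below is a minimal cleanup sketch using the google-cloud-monitoring Java client; the client method names are quoted from memory and the custom.googleapis.com/ prefix used to pick out Dataflow-created user metrics is an assumption, so treat this as illustrative rather than a verified procedure.

    import com.google.api.MetricDescriptor;
    import com.google.cloud.monitoring.v3.MetricServiceClient;
    import com.google.monitoring.v3.ProjectName;

    public class MetricDescriptorCleanupSketch {
      public static void main(String[] args) throws Exception {
        try (MetricServiceClient client = MetricServiceClient.create()) {
          ProjectName project = ProjectName.of("apache-beam-testing");
          for (MetricDescriptor descriptor : client.listMetricDescriptors(project).iterateAll()) {
            // Assumption: user-defined metrics exported by Dataflow jobs appear
            // under the custom.googleapis.com/ prefix; adjust the check as needed
            // before deleting anything in a real project.
            if (descriptor.getType().startsWith("custom.googleapis.com/")) {
              client.deleteMetricDescriptor(descriptor.getName());
            }
          }
        }
      }
    }
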
    Aug 08, 2020 12:46:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-08T12:46:16.771Z: Workers have started successfully.
    Aug 08, 2020 12:46:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-08T12:46:16.809Z: Workers have started successfully.
    Aug 08, 2020 12:46:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-08T12:46:48.949Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 08, 2020 12:46:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-08T12:46:49.195Z: Cleaning up.
    Aug 08, 2020 12:46:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-08T12:46:49.304Z: Stopping worker pool...
    Aug 08, 2020 12:47:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-08T12:47:43.171Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 08, 2020 12:47:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-08T12:47:43.211Z: Worker pool stopped.
    Aug 08, 2020 12:47:49 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-08_05_45_19-6256127474837514803 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): e389205c-e8a1-4320-b89d-0861a0c2de8e and timestamp: 2020-08-08T12:47:49.852000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     12.89

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 08, 2020 12:47:50 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.019 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':',5,main]) completed. Took 2 mins 44.131 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 34s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/jtan2vj7jpx5w

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #845

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/845/display/redirect?page=changes>

Changes:

[Boyuan Zhang] [BEAM-9977] Implement ReadFromKafkaViaSDF

[noreply] [BEAM-10656] Enable bundle finalization within the Java direct runner.


------------------------------------------
[...truncated 294.02 KB...]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 08, 2020 6:45:30 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 08, 2020 6:45:30 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 08, 2020 6:45:31 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 08, 2020 6:45:31 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 08, 2020 6:45:31 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 08, 2020 6:45:31 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 08, 2020 6:45:31 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
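
Editor's note: the exception itself names the two fixes, attaching a schema with PCollection.setRowSchema or setting a RowCoder explicitly with setCoder, before the pipeline is finalized. The sketch below shows both on a ParDo output of Row elements, the same shape as the failing ParDo(RowMonitor) step; the two-field schema is hypothetical, whereas the test's real schema comes from the HACKER_NEWS table definition.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      // Hypothetical schema for illustration only.
      static final Schema SCHEMA =
          Schema.builder().addStringField("author").addInt64Field("score").build();

      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        PCollection<Row> rows =
            p.apply(Create.of("someone"))
             .apply(ParDo.of(new DoFn<String, Row>() {
               @ProcessElement
               public void processElement(@Element String author, OutputReceiver<Row> out) {
                 out.output(Row.withSchema(SCHEMA).addValues(author, 3L).build());
               }
             }));

        // Without one of these, construction fails with the same IllegalStateException
        // as above, because Beam cannot infer a coder for Row on its own:
        rows.setRowSchema(SCHEMA);              // attach the schema (preferred)
        // rows.setCoder(RowCoder.of(SCHEMA));  // or set a RowCoder explicitly

        p.run().waitUntilFinish();
      }
    }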

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 08, 2020 6:45:31 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 08, 2020 6:45:31 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 08, 2020 6:45:31 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 08, 2020 6:45:31 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 08, 2020 6:45:31 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 08, 2020 6:45:31 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 08, 2020 6:45:31 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 08, 2020 6:45:31 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
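
Editor's note: for comparison, this is roughly what the pushed-down read looks like when expressed directly against BigQueryIO instead of going through Beam SQL. The selected fields and the row restriction mirror the plan and filter logged above; the table reference is illustrative and this is a sketch, not the integration test's actual code.

    import java.util.Arrays;
    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.values.PCollection;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // DIRECT_READ uses the BigQuery Storage API, which is what lets the
        // projected fields and the filter be pushed into the source itself.
        PCollection<TableRow> rows =
            p.apply(BigQueryIO.readTableRows()
                .from("bigquery-public-data:hacker_news.full")  // assumption: illustrative table id
                .withMethod(Method.DIRECT_READ)
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }
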
    Aug 08, 2020 6:45:32 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 08, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 08, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-Iq6FwT7Q9Jhgp26WEX72mQ2QCu5JbZE5qOJuzLold58.jar
    Aug 08, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-mclI_QAX7nKA2UwpOZwQuhKOwW5-pYYG7WN5CQjRa-I.jar
    Aug 08, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.24.0-SNAPSHOT-SKzo9JYmntu4dRHo27ulUwTpDYEE1OV8x1PGfGk3E3c.jar
    Aug 08, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-Iq6FwT7Q9Jhgp26WEX72mQ2QCu5JbZE5qOJuzLold58.jar
    Aug 08, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT-1LmEFdmh8mqcCW7FEc-p_hiG6TWISGRGMxfOnULRoiQ.jar
    Aug 08, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT-09csnMCGsJq8PQFjK4v7QKD1bbwyIZiCvW1mWF4GSt4.jar
    Aug 08, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.24.0-SNAPSHOT-Ugxyott7NAa19IjQotfIyGNyvXN7sV8cnoaLt4yHMVc.jar
    Aug 08, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-0HP2ZpTlqxFDbVxFjR6jGoXxBzJcj1y-mp3-7s7Q4Qk.jar
    Aug 08, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests-g0r7cZsZWA2nNB6d40JVFhcuuPQr7__vFySJDaWO3k4.jar
    Aug 08, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-Hr2mPzYUCQkhRmVZq92sscELqUyPltFwx9dz-a3ycEY.jar
    Aug 08, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests-KUnJncPD7YV6DcqsRcRnNrWyPUdVDC6YjQfmHqCiBDQ.jar
    Aug 08, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT--idndU3bQOTY-KeFwzKvz_1MydBReo2sG63yt2Lrxvw.jar
    Aug 08, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT-lbH48VLCit9nPwPILMVT0abmhudigGr_Ofm_dpD80HE.jar
    Aug 08, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-QEItuz7Wu5OuRIg4iteViQS6yNQOfa8HBI98kjIVpP0.jar
    Aug 08, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4981028871829409879.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-SHxz4BKlF1ftUS2s0R3cLTgID8xagX3zgXzjwSX4QPE.jar
    Aug 08, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-2jWkq7X3AiXN8QeroRkQf0DxeNU5MawVmUblch2rGR8.jar
    Aug 08, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests-5vC-ETBSqAUJqe03NAb12dyvb-KVUPeUq5OxwB-kArs.jar
    Aug 08, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-H20G3lUrF6y_0Nl1LwQdAvSC1RvveI8IJ-IU1AClGvg.jar
    Aug 08, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT--4VR28YY51xtd3A65JuoKouRHf3lpLbLT8Eo7_HNkr0.jar
    Aug 08, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.24.0-SNAPSHOT-b5dmi_6e-wYWLCjPne0IACxnTCEgffDIdZiEMAohCvo.jar
    Aug 08, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests-5HuyGiySvAdcdGCcQ1HEQKoBECv25MkKJcO-is8OB6I.jar
    Aug 08, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests-ecU76Y9YNHcp-DoHfSBMbPZ12jcku5dh8qvHMNk_ZEo.jar
    Aug 08, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-QolSHPRYbZiZSha5muwiXIVinFZCIuHolVF7kXMx3jQ.jar
    Aug 08, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT--Cg3zT9SPIzY4R6OCaNtd-dpdxJrF0Vhnl12hkTpfdQ.jar
    Aug 08, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests-EWL5CRAnfe8QhSR1TLewcgpb9EPXGX0DiYPr5V8vutY.jar
    Aug 08, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests-WXOPJT9uZAxp7Tqv0moY1HE7pSvqwrmKeZ9Jj0iTNDc.jar
    Aug 08, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-tests-hwet0COPiTYabgBNi7XDQPUqcEON_s9AWQaFUZh1rmU.jar
    Aug 08, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded-ciCL1ebs73ZigusSzirfo3_909hydMzi83r1VR0YLDc.jar
    Aug 08, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.24.0-SNAPSHOT-IH4GisNX8s_MmDfmse8MggY3QU6AovLL_u3ygUfctow.jar
    Aug 08, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.24.0-SNAPSHOT-pP-eo2Hiw3HInPMQiv3RtFQ7djvmO2ZlYSDHBtbJfm4.jar
    Aug 08, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.24.0-SNAPSHOT-IM7QCIlJMUMU_5KdPVtUctVdTW-cdLxQwK_nLlzB_4Y.jar
    Aug 08, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 1 seconds
    Aug 08, 2020 6:45:36 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 08, 2020 6:45:36 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 08, 2020 6:45:36 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 08, 2020 6:45:36 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 08, 2020 6:45:36 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 08, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93834 bytes, hash dae7e03bad3695dd5ab20c0864014fd6955f061b8cd57da1c2ad123dd52bfb24> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-2ufgO602ld1asgwIZAFP1pVfBhuM1X2hwq0SPdUr-yQ.pb
    Aug 08, 2020 6:45:36 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.24.0-SNAPSHOT
    Aug 08, 2020 6:45:37 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-07_23_45_36-5766712237787413951?project=apache-beam-testing
    Aug 08, 2020 6:45:37 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-07_23_45_36-5766712237787413951
    Aug 08, 2020 6:45:37 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-07_23_45_36-5766712237787413951
    Aug 08, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-08T06:45:36.773Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 08, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-08T06:45:44.327Z: Worker configuration: n1-standard-1 in us-central1-f.
    Aug 08, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-08T06:45:44.985Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 08, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-08T06:45:45.033Z: Expanding GroupByKey operations into optimizable parts.
    Aug 08, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-08T06:45:45.061Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 08, 2020 6:45:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-08T06:45:45.129Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 08, 2020 6:45:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-08T06:45:45.164Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 08, 2020 6:45:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-08T06:45:45.199Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 08, 2020 6:45:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-08T06:45:45.235Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 08, 2020 6:45:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-08T06:45:45.584Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 08, 2020 6:45:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-08T06:45:45.667Z: Starting 5 workers in us-central1-f...
    Aug 08, 2020 6:46:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-08T06:46:03.684Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 08, 2020 6:46:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-08T06:46:08.013Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Aug 08, 2020 6:46:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-08T06:46:08.047Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Aug 08, 2020 6:46:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-08T06:46:13.430Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 08, 2020 6:46:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-08T06:46:32.754Z: Workers have started successfully.
    Aug 08, 2020 6:46:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-08T06:46:32.783Z: Workers have started successfully.
    Aug 08, 2020 6:47:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-08T06:47:03.750Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 08, 2020 6:47:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-08T06:47:03.938Z: Cleaning up.
    Aug 08, 2020 6:47:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-08T06:47:04.044Z: Stopping worker pool...
    Aug 08, 2020 6:47:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-08T06:47:55.810Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 08, 2020 6:47:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-08T06:47:55.858Z: Worker pool stopped.
    Aug 08, 2020 6:48:05 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-07_23_45_36-5766712237787413951 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): a35dbd80-afb2-4072-bcb5-306d25ed369e and timestamp: 2020-08-08T06:48:05.279000000Z:
                     Metric:                    Value:
                   read_time                    13.477
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 08, 2020 6:48:05 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.019 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 2 mins 42.076 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 51s
106 actionable tasks: 73 executed, 33 from cache

Publishing build scan...
https://gradle.com/s/irgrh6rcdjas6

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #844

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/844/display/redirect?page=changes>

Changes:

[kevinsijo] documentation(xlang-java): updating docs as per comments

[kevinsijo] documentation(xlang-python): updating docs as per comments

[kevinsijo] documentation(xlang-python): resolving ascii errors

[srohde] Add ElementLimiters to all Cache Managers.

[tobiasz.kedzierski] [BEAM-10662] Fix GCP variable check in build python wheels workflow

[noreply] fix precommit errors (#12500)

[Robin Qiu] Support NULL query parameters in ZetaSQL and fix nullable ARRAY bug

[noreply] Merge pull request #12473 from [BEAM-10601] DICOM API Beam IO connector

[kcweaver] [BEAM-10653] Modularize BeamSqlDslUdfUdafTest.

[noreply] [BEAM-10619] Report ratio of implemented pandas tests (#12440)

[daniel.o.programmer] [BEAM-10289] Fixing bug in Go harness split response.

[noreply] [BEAM-8460] Exclude category containing failing tests for spark/flink to


------------------------------------------
[...truncated 294.49 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 08, 2020 12:45:27 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 08, 2020 12:45:27 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 08, 2020 12:45:27 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 08, 2020 12:45:27 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 08, 2020 12:45:27 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 08, 2020 12:45:27 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 08, 2020 12:45:27 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 08, 2020 12:45:27 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 08, 2020 12:45:27 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 08, 2020 12:45:27 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 08, 2020 12:45:27 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 08, 2020 12:45:27 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 08, 2020 12:45:27 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 08, 2020 12:45:27 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 08, 2020 12:45:27 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 08, 2020 12:45:28 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 08, 2020 12:45:30 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 08, 2020 12:45:30 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-cZOMKd2y9Dy3ZisyF_kgqwGcsC-3SYmFkaXsL8UYIVQ.jar
    Aug 08, 2020 12:45:30 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-n2Tkfiwzbb3eO-C3g7m2FoHRLdK_Papxzh2eZIqGeH4.jar
    Aug 08, 2020 12:45:30 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-1hPOW0H46DTDS7pVcie0qF62zOl_mssQyaQl1YgbjRE.jar
    Aug 08, 2020 12:45:30 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests-JmfSrL7v-E9A3rXOJkwmRj7c96lWUEtkON6oKflDyDw.jar
    Aug 08, 2020 12:45:30 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests-mJQCXxESHF7sM9ptrAo00fsziF28MpfLJtaNBViOLOI.jar
    Aug 08, 2020 12:45:30 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-idrTeJvXl740dr4ZeRRbdeAU9eg9MZqghHWX1TN64So.jar
    Aug 08, 2020 12:45:30 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-cZOMKd2y9Dy3ZisyF_kgqwGcsC-3SYmFkaXsL8UYIVQ.jar
    Aug 08, 2020 12:45:30 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.24.0-SNAPSHOT-upBKKmwiMJSX4k_wsqOJJ5iRxHd4WGcZhwdonOGra6g.jar
    Aug 08, 2020 12:45:30 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests-M7x5UEj8JLNla92Ra26KEa_dN7dWGmYfDbJf3b_rnsg.jar
    Aug 08, 2020 12:45:30 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests-lbu_b2HCIRCy8X-CBf8-GmtMUiugZCDJkdg4pXhkn9o.jar
    Aug 08, 2020 12:45:30 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.24.0-SNAPSHOT-arQ-C0ZrHJ6mh7jGyqCUYoGCVk2xGSUx3gwKAYE1qgY.jar
    Aug 08, 2020 12:45:30 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT-mD4UfN81oPNIVB-VJCwH4Tn863rcuVkEaYZnRIqDIFk.jar
    Aug 08, 2020 12:45:30 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.24.0-SNAPSHOT-ttfB3JhelLZS0JuzWdnad9B396eaLrOEaa2VYBne72g.jar
    Aug 08, 2020 12:45:30 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-ZOkPDaihmjkpTiW1tmpsa-XdMX4XGlWw4XI9fdRBS9M.jar
    Aug 08, 2020 12:45:30 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT-7A9M0d9GzLwFvKvBJ2ATVrrrQkq-Aofwbr05jtUyMvo.jar
    Aug 08, 2020 12:45:30 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded-LxrZ2yNGDDmXap5K7duWXxI209T02o9jdDVyTXPJFcs.jar
    Aug 08, 2020 12:45:30 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests-YK2km8UH8q7S92VdbbKEdkBxhWnfOf2YNnt9bjQdT9s.jar
    Aug 08, 2020 12:45:30 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-Hf4noSHB0jeSfoeWFid1F_Cb7SD3OuTrdsaZxH05UmY.jar
    Aug 08, 2020 12:45:30 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-9nbDdAK0Ai0shsjmF6pZYY4Xo0QMQbHP_t_1WHG32pI.jar
    Aug 08, 2020 12:45:30 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests-OEp2Oo_vXx9F9tFo6Vs-21I1Bynpc0dxdTH7j4K0q2I.jar
    Aug 08, 2020 12:45:30 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-8xMfOSQ9Wj1251r7hBkuKS05Y2qqDUTXbbiPYUo9ci8.jar
    Aug 08, 2020 12:45:30 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7934576756413097233.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-yEhDsUBmLKNVT1w_lpjMf0tY7Q4959maIBRAsdrkLsg.jar
    Aug 08, 2020 12:45:30 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-_wCypbjuISc3VdLlukDtXKmr2xn1iKUT1YEXoxUf998.jar
    Aug 08, 2020 12:45:30 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-lPvw8HHo5E6h_kGeu6vPThwGoElCcos1SToL469LE_M.jar
    Aug 08, 2020 12:45:30 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT-udRD3BwPjP_4bLXr22FnpX8mLCr2YmZv1yhlSPRtutQ.jar
    Aug 08, 2020 12:45:30 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-tests-5FgN2nTcuCaS9JbvzSWCJ7erlWZsoHJYIS-FTKcUoK8.jar
    Aug 08, 2020 12:45:30 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-1vr7BDGnXOoss3a-Yo6DiQr0BaXYobbFwABPM0UP3hU.jar
    Aug 08, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.24.0-SNAPSHOT-f4cgX1TQ0jZ6y40ePNjUu01t8q3k9uxCapOETkKzycA.jar
    Aug 08, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.24.0-SNAPSHOT-JI9z1WA-N-alIqPN7olv7DzizcHIubSHd3o0CcHMl8E.jar
    Aug 08, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.24.0-SNAPSHOT-KxzcW6n_ixU34imHV5iKaNxSu_RxYDYWIGCItALMCew.jar
    Aug 08, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests-E4-TtS4WHZh7VZnFF9Va4bbWiGcrSV4qHQ1_Hw-kcUQ.jar
    Aug 08, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 1 seconds
    Aug 08, 2020 12:45:31 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 08, 2020 12:45:31 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 08, 2020 12:45:31 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 08, 2020 12:45:31 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 08, 2020 12:45:31 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 08, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93834 bytes, hash dd92a6516bb29787977c3565ec0671fc512e544699cd3aef98cbdab11f11801e> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-3ZKmUWuyl4eXfDVl7AZx_FEuVEaZzTrvmMvasR8RgB4.pb
    Aug 08, 2020 12:45:32 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.24.0-SNAPSHOT
    Aug 08, 2020 12:45:33 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-07_17_45_32-7261476248932327678?project=apache-beam-testing
    Aug 08, 2020 12:45:33 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-07_17_45_32-7261476248932327678
    Aug 08, 2020 12:45:33 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-07_17_45_32-7261476248932327678
    Aug 08, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-08T00:45:32.327Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 08, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-08T00:45:39.427Z: Worker configuration: n1-standard-1 in us-central1-a.
    Aug 08, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-08T00:45:40.201Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 08, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-08T00:45:40.243Z: Expanding GroupByKey operations into optimizable parts.
    Aug 08, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-08T00:45:40.282Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 08, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-08T00:45:40.360Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 08, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-08T00:45:40.392Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 08, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-08T00:45:40.430Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 08, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-08T00:45:40.460Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 08, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-08T00:45:40.990Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 08, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-08T00:45:41.073Z: Starting 5 workers in us-central1-a...
    Aug 08, 2020 12:46:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-08T00:46:08.520Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 08, 2020 12:46:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-08T00:46:10.887Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 08, 2020 12:46:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-08T00:46:34.264Z: Workers have started successfully.
    Aug 08, 2020 12:46:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-08T00:46:34.296Z: Workers have started successfully.
    Aug 08, 2020 12:47:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-08T00:47:09.819Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 08, 2020 12:47:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-08T00:47:10.057Z: Cleaning up.
    Aug 08, 2020 12:47:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-08T00:47:10.139Z: Stopping worker pool...
    Aug 08, 2020 12:48:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-08T00:48:03.414Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 08, 2020 12:48:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-08T00:48:03.477Z: Worker pool stopped.
    Aug 08, 2020 12:48:11 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-07_17_45_32-7261476248932327678 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 90bbcab3-93b0-4c27-9c27-381022ba8a93 and timestamp: 2020-08-08T00:48:11.053000000Z:
                     Metric:                    Value:
                   read_time                    14.762
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 08, 2020 12:48:11 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.017 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 2 mins 52.4 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 55s
106 actionable tasks: 72 executed, 34 from cache

Publishing build scan...
https://gradle.com/s/wyjc2i722nxcs

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #843

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/843/display/redirect?page=changes>

Changes:

[kamil.galuszka] [BEAM-10647] Fixes get_query_location bug in BigQueryWrapper

[noreply] [BEAM-7390] Add top code snippets (#12482)

[noreply] GH-Actions workflow checks are GCP variables set [depends on BEAM-10599]


------------------------------------------
[...truncated 293.02 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 07, 2020 6:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 07, 2020 6:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 07, 2020 6:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 07, 2020 6:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 07, 2020 6:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 07, 2020 6:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 07, 2020 6:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
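
This is the same coder failure as in the earlier builds, and the exception text already names the fix: a ParDo that emits Beam Rows has no inferable coder until the output PCollection is given a schema. A self-contained sketch of that fix, assuming an illustrative schema and a stand-in DoFn rather than the IT's real RowMonitor, is:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Field names follow the query in the log; the field types are assumptions.
        Schema schema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();

        Row sample =
            Row.withSchema(schema).addValues("someone", "story", "a title", 3L).build();

        PCollection<Row> monitored =
            p.apply(Create.of(sample).withCoder(RowCoder.of(schema)))
                .apply(
                    "RowMonitor", // stand-in for the IT's ParDo(RowMonitor)
                    ParDo.of(
                        new DoFn<Row, Row>() {
                          @ProcessElement
                          public void process(@Element Row row, OutputReceiver<Row> out) {
                            out.output(row);
                          }
                        }))
                // Without this call, coder inference fails at pipeline construction with
                // exactly the IllegalStateException shown above.
                .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }
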

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 07, 2020 6:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 07, 2020 6:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 07, 2020 6:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 07, 2020 6:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 07, 2020 6:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 07, 2020 6:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 07, 2020 6:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 07, 2020 6:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 07, 2020 6:45:14 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 07, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 07, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-8Fw-FVyfEnyDjympjWNucafS46bDbQoxNqC8AzxX-dc.jar
    Aug 07, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-zb6hiadwfnIzyfQxifRYQtKeg1huYeqJEt2a1c4bCsc.jar
    Aug 07, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests-vhfn9RLn5N1qrCTIBcjHUBhdToVtdqzKM9zHWEYbAB4.jar
    Aug 07, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-jvyjIoKP7CTIontWW9sFbt-wNnt93OZJLkVgai4ZBbs.jar
    Aug 07, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests-LRa8d1iO17CroXIkSpKLXWvafrSFvkSePGhpfJpMnHM.jar
    Aug 07, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.24.0-SNAPSHOT-wNhNelNTQq3ZOQVy4pk9MjSAkPrRQ9RWvklwsjkNCAc.jar
    Aug 07, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-CYcBAaTDL6Dmah-n7P2A9anxzdCzBlFf0rmgmI7o0xw.jar
    Aug 07, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-Y1e9wMwIssSbmVs5HC3jNG0B_V9Bl4rh0H4f_eBF3y4.jar
    Aug 07, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-ClhWPExa-F-Argtz8u-ps01idq5sJJ60xLm2FQjYtE4.jar
    Aug 07, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-JY06rpSF5aQJa5DHlp0sSgrEG5SUqfmvFiAP7_Gv3Xk.jar
    Aug 07, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-JpaIqS2sA1ZRaPoAK2s5DYseB1jTtrFpvjGBb3KID0k.jar
    Aug 07, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2861461373488073967.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-V_gjXPSSFJ_vxXFxKr6JAlu8CSJwKwhblfmGbuBegmA.jar
    Aug 07, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT-LtyEMsfW6Kr3AcpCkBZec1_vj0fzF71xkI5pecLGwX4.jar
    Aug 07, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests-nE1ZKaBrjv0m3NLLdbfMvXQGsZYBsjsFAqM97FMfFJI.jar
    Aug 07, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT-YySP1lT0UamW5ru7X8SAt7p5M36lGGdBBZU0eoFD6mo.jar
    Aug 07, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-Zd2LTUqNQKFuy_VjLVxf-e_oIMyZ4zIcqkwzSqW641U.jar
    Aug 07, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests-ELvvcoU1BL9m28dyB2sWHsS3qJKG_RnIH60UlkwXnmc.jar
    Aug 07, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.24.0-SNAPSHOT-6vmcLXnLjABpQ0-7Y4nXyz4U6HjRcHMI7U0HnNKJTwk.jar
    Aug 07, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-vTHVKKXruNiiXPg37A5flxwE9Ok4WNdOosXHau_xIUI.jar
    Aug 07, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests-kzmiRhjXbwQShMOVQxvL_yGnDNE6uYJC7KpthKf9H7o.jar
    Aug 07, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests-Fme0Oy2w7HAb5XslYnG3MU963moouZO41o3gA4x8swQ.jar
    Aug 07, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT-c6LfDp14Oc-MvJxUEhbx8-jv9HcxUGD7QjmnyEb4wVg.jar
    Aug 07, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-QXEA7Nj-k2l3aJV14xedjjwJYxOiMfPp0G-mVEyUumk.jar
    Aug 07, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-8Fw-FVyfEnyDjympjWNucafS46bDbQoxNqC8AzxX-dc.jar
    Aug 07, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.24.0-SNAPSHOT-gaHrYX0R5QQmsTBJG7V8tPnqLAue-kERUCGrk42L__E.jar
    Aug 07, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests-nRh1fMq4HKucQkFUkwlp2E58V5Z6O_vVoRyJ8vaK_Hg.jar
    Aug 07, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-tests-G7o-qotRnpIq4n0rfe_8OeQqXTYLSaGLZoFlx69M1Xw.jar
    Aug 07, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded-z5NFZQ9P3V8iHOuyC0_5tq9oq8Q2ivCFeZgqqtsGPRQ.jar
    Aug 07, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.24.0-SNAPSHOT-vXEhKUkCsqdz_UvvMjBBjJIsA0nZmQwFXtGwrWTYojA.jar
    Aug 07, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.24.0-SNAPSHOT-xjXqDS7toDOkO9_DUNP-h5CneCcNZHQykAJcAHD5ytE.jar
    Aug 07, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.24.0-SNAPSHOT-ZvxrKUvsj3F1OwQZbOHUH29pk-ivCD7lT9Gsf1A-xfw.jar
    Aug 07, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 1 seconds
    Aug 07, 2020 6:45:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 07, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 07, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 07, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 07, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 07, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93834 bytes, hash d6ea033e134774ad006d5def7bce775adcd005bf5ca5883137ed884e2921d8c9> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-1uoDPhNHdK0AbV3ve853WtzQBb9cpYgxN-2ITikh2Mk.pb
    Aug 07, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.24.0-SNAPSHOT
    Aug 07, 2020 6:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-07_11_45_20-13954236799954483885?project=apache-beam-testing
    Aug 07, 2020 6:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-07_11_45_20-13954236799954483885
    Aug 07, 2020 6:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-07_11_45_20-13954236799954483885
    Aug 07, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-07T18:45:20.694Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 07, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-07T18:45:29.962Z: Worker configuration: n1-standard-1 in us-central1-a.
    Aug 07, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-07T18:45:30.712Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 07, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-07T18:45:30.747Z: Expanding GroupByKey operations into optimizable parts.
    Aug 07, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-07T18:45:30.766Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 07, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-07T18:45:30.832Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 07, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-07T18:45:30.864Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 07, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-07T18:45:30.907Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 07, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-07T18:45:30.941Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 07, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-07T18:45:31.280Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 07, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-07T18:45:31.353Z: Starting 5 workers in us-central1-a...
    Aug 07, 2020 6:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-07T18:45:46.183Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 07, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-07T18:45:55.452Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 07, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-07T18:46:23.841Z: Workers have started successfully.
    Aug 07, 2020 6:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-07T18:46:23.870Z: Workers have started successfully.
    Aug 07, 2020 6:46:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-07T18:46:54.165Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 07, 2020 6:46:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-07T18:46:54.363Z: Cleaning up.
    Aug 07, 2020 6:46:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-07T18:46:54.448Z: Stopping worker pool...
    Aug 07, 2020 6:47:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-07T18:47:43.481Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 07, 2020 6:47:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-07T18:47:43.531Z: Worker pool stopped.
    Aug 07, 2020 6:47:51 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-07_11_45_20-13954236799954483885 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 3297d384-f744-4dcf-adc6-41df344e7187 and timestamp: 2020-08-07T18:47:51.516000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    13.008

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 07, 2020 6:47:51 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.024 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.037 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 7,5,main]) completed. Took 2 mins 47.621 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 36s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/gkbsgrkd2wxjy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #842

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/842/display/redirect?page=changes>

Changes:

[noreply] [BEAM-10630] Check load tests as part of the release process (#12455)


------------------------------------------
[...truncated 293.43 KB...]
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 07, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 07, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 07, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 07, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 07, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 07, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 07, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 07, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 07, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 07, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 07, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 07, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 07, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 07, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 07, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 07, 2020 12:45:13 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 07, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 07, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.24.0-SNAPSHOT-u44gGbhbDqrIgf35rREvPSi-l4xjGrhQE4yMib1WESw.jar
    Aug 07, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests-6lnCPqoSgTmApmWlkKnjL79KHTjrM9rVKKp-YKPZ7a0.jar
    Aug 07, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-cKm3WYP2SkGiXJnEnsR4rq1OIjDDuLxQ1fammzAtVO8.jar
    Aug 07, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-f0AZ1e3XWnpS8KUv4GgE3A1XDLK1fqHhwPVgnFYgiQE.jar
    Aug 07, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.24.0-SNAPSHOT-1vGrf8UvVNNnyznJGaNQahwYFACnMwryhBm3fauLOBo.jar
    Aug 07, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.24.0-SNAPSHOT-kpFN_P8zqTiZJv3rqdtPVusvmV8F-wyPl_SO9R4PtRc.jar
    Aug 07, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-5c2PK9-m7AdhfEowQT0qGIfnIXNKfmL4VK_BQWqR7oU.jar
    Aug 07, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT-1YqO1_5LNiPENuo3_2JfdfZjCCUJuJlTHTEuJ8DSGDM.jar
    Aug 07, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-J5_rfnvlWyIwdUlhrmWUnxcuz3xrx4g91unDh2dWyq8.jar
    Aug 07, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-tests-ZqHLeWket4kWWM0WRhmFyhOvxneS9dg2JPo871pegBM.jar
    Aug 07, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-_MjXNKOjPJWyYBG1dT0Xm3HziPXjtjj2ZpSvGMqkCZA.jar
    Aug 07, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-9qLdRTQW_jpLuW8KfMWlbQzLDUcLHGqCldgRtYrj6Pk.jar
    Aug 07, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests-acwRVc0NA9yF-5PXTEeDO08Cr_YosqGXwsBPQROcNGs.jar
    Aug 07, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests-9Rq8gW5a2lFyGy4PSgapifJ7U6eK1u9zmDVPiBurt3c.jar
    Aug 07, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4146290724428537829.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-PlNNVSFpXdgYy7Ilh1WEY_BMDgiq4r7kBG2WkOXBbrY.jar
    Aug 07, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests-QytMuqTBJNj-d0Vl1jVVlq22rt5UxjRL4OYLYFWwPI0.jar
    Aug 07, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-e2jazQeV9PowkpFxN4Q1Vxzh253YC85QP625PdP7CcY.jar
    Aug 07, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests-yKRikfYOqU6200htaSOGB_qAYv5i2uBEcQBeiC-p25w.jar
    Aug 07, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT-xaYAg-YlO1oNjihaFCaEGa5oKgDEZb6huEY-FnSi7Ng.jar
    Aug 07, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-gchz8R9BtH_WNUsC-mJOmWY0i7-hcoP5l2fekPtuycA.jar
    Aug 07, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-U51ny0p4L-mH2JuawKEhfKWs8-6Dvw_oAjBmfB6o_Ok.jar
    Aug 07, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests-8eJ7AS1TCbesF5P7DMvFellsbBVjwh8Tl61mFGaOqsg.jar
    Aug 07, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT-D_0AXC3udh-nEeuP1tMw_XVTz470caR-VCa2q3S15zo.jar
    Aug 07, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests-dIwjEkFJV8iYQqh9iucnzXkbsqW6CHhDpI_ce3nYzrg.jar
    Aug 07, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-WuE4eEmkavc89oebqWkNfB19fIvtdqsfromvyuquXHk.jar
    Aug 07, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded-nP6SnmuyqeVbOri2TCcnzbQ4LQZ2sEUN7Ijh5IeG3sA.jar
    Aug 07, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-BqVEhROR_T1TmVa0Tyu_-hdy2wqtT-6vmNgX_Qpu9-Y.jar
    Aug 07, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-5c2PK9-m7AdhfEowQT0qGIfnIXNKfmL4VK_BQWqR7oU.jar
    Aug 07, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.24.0-SNAPSHOT-8nW0qH9S3tD5XVMDc8pgXrjSqDAKgjvGqQlp5QaWAiE.jar
    Aug 07, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.24.0-SNAPSHOT-26xREUUN7W8Q4MkLiS_LHj0nVD3lSd_bbcs6BSiobQk.jar
    Aug 07, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.24.0-SNAPSHOT-x7WKlExLYW2ia182ENBPoDeBjLCT6GxbZvurVtzC-3o.jar
    Aug 07, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 1 seconds
    Aug 07, 2020 12:45:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 07, 2020 12:45:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 07, 2020 12:45:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 07, 2020 12:45:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 07, 2020 12:45:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 07, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93834 bytes, hash c62c58836378bc5e4ed13dea71238dd715d53f65aba47d88733f18ee5e18881e> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-xixYg2N4vF5O0T3qcSON1xXVP2WrpH2Icz8Y7l4YiB4.pb
    Aug 07, 2020 12:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.24.0-SNAPSHOT
    Aug 07, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-07_05_45_17-9311222154891938071?project=apache-beam-testing
    Aug 07, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-07_05_45_17-9311222154891938071
    Aug 07, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-07_05_45_17-9311222154891938071
    Aug 07, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-07T12:45:17.185Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 07, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-07T12:45:25.793Z: Worker configuration: n1-standard-1 in us-central1-a.
    Aug 07, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-07T12:45:26.516Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 07, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-07T12:45:26.562Z: Expanding GroupByKey operations into optimizable parts.
    Aug 07, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-07T12:45:26.598Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 07, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-07T12:45:26.687Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 07, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-07T12:45:26.727Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 07, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-07T12:45:26.761Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 07, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-07T12:45:26.790Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 07, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-07T12:45:27.415Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 07, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-07T12:45:27.493Z: Starting 5 workers in us-central1-a...
    Aug 07, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-07T12:45:40.171Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 07, 2020 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-07T12:45:56.106Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Aug 07, 2020 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-07T12:45:56.134Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Aug 07, 2020 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-07T12:46:01.579Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 07, 2020 12:46:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-07T12:46:20.041Z: Workers have started successfully.
    Aug 07, 2020 12:46:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-07T12:46:20.084Z: Workers have started successfully.
    Aug 07, 2020 12:46:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-07T12:46:53.993Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 07, 2020 12:46:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-07T12:46:54.251Z: Cleaning up.
    Aug 07, 2020 12:46:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-07T12:46:54.350Z: Stopping worker pool...
    Aug 07, 2020 12:47:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-07T12:47:51.295Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 07, 2020 12:47:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-07T12:47:51.337Z: Worker pool stopped.
    Aug 07, 2020 12:47:56 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-07_05_45_17-9311222154891938071 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 6de05360-8c11-4075-a58e-b357e8cc6aae and timestamp: 2020-08-07T12:47:56.832000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    13.213
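
For comparison with the readUsingDefaultMethod failure above, the DIRECT_READ variant pushes both the projection (by, type, title, score) and the filter into the BigQuery Storage read itself, which is what the "Pushing down the following filter" line and the BeamPushDownIOSourceRel plan show. A rough hand-written equivalent using BigQueryIO directly is sketched below; the table reference is a placeholder, and the field list and row restriction are copied from the logged plan, so treat it as an illustration of the pushdown rather than the test's actual code.

    import java.util.Arrays;
    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.values.PCollection;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        PCollection<TableRow> rows =
            p.apply(
                BigQueryIO.readTableRows()
                    // Placeholder table; not necessarily the table this perf test reads.
                    .from("bigquery-public-data:hacker_news.full")
                    .withMethod(TypedRead.Method.DIRECT_READ)
                    // Column projection handled server-side by the Storage Read API.
                    .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                    // Row filter handled server-side by the Storage Read API.
                    .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }

Only the selected columns and the rows matching the restriction leave BigQuery, which is why this variant completes and reports fields_read/read_time while the other two read variants fail earlier with the coder problem shown above.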

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 07, 2020 12:47:57 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.018 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 2 mins 53.211 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 41s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/wbora2323dqii

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #841

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/841/display/redirect>

Changes:


------------------------------------------
[...truncated 292.69 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 07, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 07, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 07, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 07, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 07, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 07, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 07, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 07, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 07, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 07, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 07, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 07, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 07, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 07, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 07, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 07, 2020 6:45:15 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 07, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 07, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-zQldz1x6sZgDYRw10Luzigy0U0EsCtfNxM0YYJLDRSY.jar
    Aug 07, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-UqgAknee4xYRPrNDbaFo601s9yaLnfSuTQaPpZgyYUM.jar
    Aug 07, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests-TA2ozv6NY9pwiFXFPgll7xfO4hFZFdDUbAz5ml2Dg_c.jar
    Aug 07, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-jRxYsMpu5XYvY_A9sBNX1ByrTE9FI5o9-T8LRHSU3QQ.jar
    Aug 07, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-zQldz1x6sZgDYRw10Luzigy0U0EsCtfNxM0YYJLDRSY.jar
    Aug 07, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-8T65Sr4Px5GNAJ4ZMm_Zdh2lFh3czbLkz6AZ8EhVzpY.jar
    Aug 07, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.24.0-SNAPSHOT-vzusp8A6bVK6vsR5SXU80RRrRWCZ6A81PMIYioNgYL4.jar
    Aug 07, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.24.0-SNAPSHOT-a0Eki9bm6V0EAU73kSI_KJR0Fbt-CGpOuCG3dAZ6NH0.jar
    Aug 07, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-ZNh2gLpmICme2BRzj7wyLKTr7P7vpfDwbPR6GVK6krA.jar
    Aug 07, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests-pTlUA4tKoFLO9fDv-o-PzCBahCzAa4VPTTwRdJ1BKFw.jar
    Aug 07, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT-qyyrVPPOZhgeo6T2wyXU-MiNnaGuRSc_2stAHnszqqE.jar
    Aug 07, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests-BtZuOjUjpRlU4lAxbb-QgkxT-dulRJCmUflRM6Q1kuY.jar
    Aug 07, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-Nh3FAzWPHu4CNq-qtcbpNygpKK5bFb5acY6YGrlTxs0.jar
    Aug 07, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-9nFqMCdLQTTAIynWUCdO9ipYMgwm4DcpM4IAprFgVnM.jar
    Aug 07, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests-BO3N4ZvRrx4lsrrnaSTWepBTiU7cne9uA2Q91FgluoE.jar
    Aug 07, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT-NmvA0DlWQOgHkYPRUGYvQ9g0YkMqZgb4VHfTqN69Ed4.jar
    Aug 07, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests-niDb7o_6nMhC3a6rVKyNmCHOVJTAIkTbznWhfceo9UU.jar
    Aug 07, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-KpzzdP3ehl5VJNR5Cfrk9UDCQ40FCReGbNq9r2Edvnc.jar
    Aug 07, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5436122382691189469.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-CaZEf4SU9ssDuhGS8qfIWqMACYUP_JDsqbFQig9CwnI.jar
    Aug 07, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-83NbZucrA4YqVf6rcZyf9h5p7HGiJkI3KhDzAcZivHA.jar
    Aug 07, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests-hzYtzq7Ez5BADkeWkihtBgN8qLNLVoqt26rjWJjfC4M.jar
    Aug 07, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-ZJjaBf3xq1VKtCQ6V9b5xTrn3Lw_4YRaC5thEIp0Cis.jar
    Aug 07, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT-pSZ2Yj-EkDkuSmEwNSqq4OYvtaMySwO1I9rirC3UZXM.jar
    Aug 07, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests-MB1sex_vEMftP42F_A9xWntQC4Y8rq0ieoHHyVD48oY.jar
    Aug 07, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-sBHhbmk0BD0uRgYXehDDsK69bYlgVnH9JnG0U-dKdHo.jar
    Aug 07, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded-Krqd39iOvO6SV40vin1AZ4Qry9_B-BmkX1R1qK1T02s.jar
    Aug 07, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-tests-uzA3MUDwRJKU0DJCRM7oVolH1b1M1QHBmXT44Umk7Nw.jar
    Aug 07, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.24.0-SNAPSHOT-3EK9vEMEqIexBxwpHmtCjYki5oxmswRIn2y3xJqUuSk.jar
    Aug 07, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.24.0-SNAPSHOT-tkMrmAN8hT5VZJGszQIkuoYp1CsAmdg-AlrJ01KL_zM.jar
    Aug 07, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.24.0-SNAPSHOT-3U9ZdTHu_zdTsP3cAmMWEkHOjyPo_fFlGM8AGSv2HXU.jar
    Aug 07, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.24.0-SNAPSHOT-xohaZpsuImqb6d1qaN3-OjecUbZMW3tBEQLyuCSGnDg.jar
    Aug 07, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 1 seconds
    Aug 07, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 07, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 07, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 07, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 07, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 07, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93834 bytes, hash 5a4aa9770c34f18b32d1ba2e3f5f5c73b97fb371c388e6a9ebbdc2a29eaba7b6> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Wkqpdww08Ysy0bouP19cc7l_s3HDiOap673Cop6rp7Y.pb
    Aug 07, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.24.0-SNAPSHOT
    Aug 07, 2020 6:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-06_23_45_18-4640783247973954789?project=apache-beam-testing
    Aug 07, 2020 6:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-06_23_45_18-4640783247973954789
    Aug 07, 2020 6:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-06_23_45_18-4640783247973954789
    Aug 07, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-07T06:45:18.873Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 07, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-07T06:45:27.673Z: Worker configuration: n1-standard-1 in us-central1-f.
    Aug 07, 2020 6:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-07T06:45:28.470Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 07, 2020 6:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-07T06:45:28.512Z: Expanding GroupByKey operations into optimizable parts.
    Aug 07, 2020 6:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-07T06:45:28.543Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 07, 2020 6:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-07T06:45:28.624Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 07, 2020 6:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-07T06:45:28.652Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 07, 2020 6:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-07T06:45:28.686Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 07, 2020 6:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-07T06:45:28.715Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 07, 2020 6:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-07T06:45:29.150Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 07, 2020 6:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-07T06:45:29.213Z: Starting 5 workers in us-central1-f...
    Aug 07, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-07T06:45:37.762Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 07, 2020 6:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-07T06:45:53.448Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 07, 2020 6:46:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-07T06:46:14.404Z: Workers have started successfully.
    Aug 07, 2020 6:46:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-07T06:46:14.436Z: Workers have started successfully.
    Aug 07, 2020 6:46:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-07T06:46:54.511Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 07, 2020 6:46:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-07T06:46:54.703Z: Cleaning up.
    Aug 07, 2020 6:46:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-07T06:46:54.793Z: Stopping worker pool...
    Aug 07, 2020 6:47:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-07T06:47:48.891Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 07, 2020 6:47:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-07T06:47:48.945Z: Worker pool stopped.
    Aug 07, 2020 6:47:55 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-06_23_45_18-4640783247973954789 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 8d647628-a2e8-4e8b-a4ae-d13898ef88a4 and timestamp: 2020-08-07T06:47:55.579000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    20.355

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 07, 2020 6:47:56 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.016 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 7,5,main]) completed. Took 2 mins 49.972 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 39s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/c5fzuc6wlspgy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #840

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/840/display/redirect?page=changes>

Changes:

[piotr.szuberski] [BEAM-10135][BEAM-10136] Refactor jdbc external transform registrar

[Rui Wang] [BEAM-10633] UdfImpl should be able to return java.util.List.

[piotr.szuberski] [BEAM-10543] Add new parameters to Kafka Read cross language

[piotr.szuberski] [BEAM-10543] Add new parameters to python wrapper of Kafka Read

[piotr.szuberski] [BEAM-10543] Modify Kafka python cross-language integration test to use

[piotr.szuberski] [BEAM-10543] Run Kafka cross-language integration test in python

[Robert Burke] [BEAM-9615] Add initial Schema to Go conversions.

[dcavazos] [BEAM-7390] Add sum code snippets

[Robert Burke] [BEAM-9615] Improve error handling on schema conv

[noreply] Updating changes.md (#12424)

[noreply] [BEAM-10645] Create context for allowing non-parallel dataframe


------------------------------------------
[...truncated 295.35 KB...]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 07, 2020 12:45:31 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 07, 2020 12:45:31 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 07, 2020 12:45:31 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 07, 2020 12:45:31 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 07, 2020 12:45:31 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 07, 2020 12:45:31 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 07, 2020 12:45:31 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
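
This IllegalStateException is the Coder-inference failure the message itself describes: the output of ParDo(RowMonitor) is a PCollection of Beam Row elements, a Row coder cannot be inferred automatically, and the collection therefore needs either setRowSchema() or an explicit RowCoder before the pipeline is finalized. A minimal, self-contained sketch of both remedies (hand-built schema and a pass-through DoFn standing in for RowMonitor; this is not the code from BigQueryIOPushDownIT):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaExample {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Illustrative schema matching the projected columns of the query above.
        Schema schema =
            Schema.builder()
                .addNullableField("author", Schema.FieldType.STRING)
                .addNullableField("type", Schema.FieldType.STRING)
                .addNullableField("title", Schema.FieldType.STRING)
                .addNullableField("score", Schema.FieldType.INT64)
                .build();

        PCollection<Row> rows =
            p.apply(
                Create.of(
                        Row.withSchema(schema).addValues("someone", "story", "a title", 3L).build())
                    .withRowSchema(schema));

        // Pass-through ParDo producing Rows, analogous to the RowMonitor step in the log.
        PCollection<Row> monitored =
            rows.apply(
                ParDo.of(
                    new DoFn<Row, Row>() {
                      @ProcessElement
                      public void processElement(@Element Row row, OutputReceiver<Row> out) {
                        out.output(row);
                      }
                    }));

        // Without one of these two lines, pipeline construction fails exactly as above.
        monitored.setRowSchema(schema);
        // monitored.setCoder(RowCoder.of(schema));  // equivalent: explicit Row coder

        p.run().waitUntilFinish();
      }
    }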

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 07, 2020 12:45:31 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 07, 2020 12:45:31 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 07, 2020 12:45:31 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 07, 2020 12:45:31 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 07, 2020 12:45:31 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 07, 2020 12:45:31 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 07, 2020 12:45:32 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 07, 2020 12:45:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 07, 2020 12:45:32 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 07, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 07, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-e3XdJgV7pP6Be6GeIWfWIaL3gAmshEcP_1-7Z6kHOug.jar
    Aug 07, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-YluSFH459N-zYg0Y1h8Y0Ti3WJImf45vwRbjlV0Jty0.jar
    Aug 07, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests-rOIHpCyqYLfoZaOLmRsutsVWD3k-sU1eYDy7kq_pC5s.jar
    Aug 07, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests-i0cnvqDC9wnVG2GcDt4SI8E9NTXiCamm-pLR9-1pYk4.jar
    Aug 07, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-DqvmAnkeaMckLEHvjntAPkU_1YTcuvpiNLV71gxsMPg.jar
    Aug 07, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-Hv3tR9t5drPh0lmVpi_jTW6kFD6TmKiifxs8RlvFfOg.jar
    Aug 07, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.24.0-SNAPSHOT-qwgDz0vt3j1E768mPr0ppd30aLdN9Z-Aira95iYCBHU.jar
    Aug 07, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT-3gzmHEq1f62aGSDMLtECBD-T8Lw6Uih7SsL4oIxFrHk.jar
    Aug 07, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-SA9rYlLwcnls6pplPeN4MXsFkwVISN8YP8w8z2C5iyc.jar
    Aug 07, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded-T--opsw9SPEjkoTam0NfmzGO_Qfz2ZkYUl7di1xUc44.jar
    Aug 07, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-A1zOnqB-NNk9RkEUHb0Oz7QqyTi9xVq1VwBQAJcgm9Y.jar
    Aug 07, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.24.0-SNAPSHOT-oL3QhL_JP0x0cL5fQBBMEPL_WaBG4FQP9mvfTwgSjog.jar
    Aug 07, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-C5158hrgoQuvS9LTlDlGR92dwRc_Js6Opw3LYf2uXPU.jar
    Aug 07, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests-z1sx5UGIq8GQKaW0nNzLaeSdCV9yf-x7iqzSGOgItuc.jar
    Aug 07, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.24.0-SNAPSHOT-rpOujVbyqdUN2mor6eZDIIBergRJsLewPXyup4S_92o.jar
    Aug 07, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests-Oy2mC4Q14dgkUJ9VvqM_rRlXs5KQT87svZhHKPm7MUE.jar
    Aug 07, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-tests-r8man-GKsUnFbFwS9S38eRLT-FmrFBDLbhwQd6caJzc.jar
    Aug 07, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7020461626296435774.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-wgciOsnAZTGji2TzQNyTtYeWttBS-nOYOGmn_nuVkac.jar
    Aug 07, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-iqMS2dZvg_W0SZ5IHQekKcmYVa9QeRTIYk8sh0i5ono.jar
    Aug 07, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests-XzoESrpUdelGf7neKYo5j_2te4XjnQhFsTlbSjD8HMg.jar
    Aug 07, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-e3XdJgV7pP6Be6GeIWfWIaL3gAmshEcP_1-7Z6kHOug.jar
    Aug 07, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT--4OS5Mo6ymjEcsgRPcQO78Gb1P_OQiWUtsFUybUZikM.jar
    Aug 07, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests-byc7t0b5ISsimRn9HRS9eKn7Hb33-_tiutb4v_N3Lts.jar
    Aug 07, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests-VHm1YjJdtYVWVcBHuwVcWyKOyHl6oMrkPq8cwc-Q6bk.jar
    Aug 07, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-aFGbjLr0QiEuy2rJlpTypHC51ulTHfbeROpoaiuyXu0.jar
    Aug 07, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-y_lfuOAeTlO7lpeOl39Jm6_9UUi8ZzTwknUikUUi6Os.jar
    Aug 07, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-hWaG4nkE6WfYFtYsFCNxPKMwGiVW2MyZ0XOtCRUpsfg.jar
    Aug 07, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT-AkOJBqnwdzhdwFnIn4lg4oVNsI2aLMS-_4Pb_CdSPro.jar
    Aug 07, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.24.0-SNAPSHOT-JEH9dT3O5ZPz331_06cERvJlLD4DaBH6ugyuy4s0eZU.jar
    Aug 07, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.24.0-SNAPSHOT-DOIMEu_ehoGiAmDAzSrHlVuSp00A6fYwTtCkRFXVxEs.jar
    Aug 07, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.24.0-SNAPSHOT-4DrUd5blb2J7MuwMyiUdJZvsHlCoMwBSXK8bhZadV_k.jar
    Aug 07, 2020 12:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 1 seconds
    Aug 07, 2020 12:45:36 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 07, 2020 12:45:36 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 07, 2020 12:45:36 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 07, 2020 12:45:36 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 07, 2020 12:45:36 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 07, 2020 12:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93834 bytes, hash 77d2ece79a79b0549ce5f6144633463eb6cda12969a5f2c8145eb5c8d47b3fb3> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-d9Ls55p5sFSc5fYURjNGPrbNoSlppfLIFF61yNR7P7M.pb
    Aug 07, 2020 12:45:36 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.24.0-SNAPSHOT
    Aug 07, 2020 12:45:37 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-06_17_45_36-11115950985384372719?project=apache-beam-testing
    Aug 07, 2020 12:45:37 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-06_17_45_36-11115950985384372719
    Aug 07, 2020 12:45:37 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-06_17_45_36-11115950985384372719
    Aug 07, 2020 12:45:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-07T00:45:36.758Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 07, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-07T00:45:43.943Z: Worker configuration: n1-standard-1 in us-central1-a.
    Aug 07, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-07T00:45:44.854Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 07, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-07T00:45:44.890Z: Expanding GroupByKey operations into optimizable parts.
    Aug 07, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-07T00:45:44.929Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 07, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-07T00:45:45.002Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 07, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-07T00:45:45.032Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 07, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-07T00:45:45.067Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 07, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-07T00:45:45.098Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 07, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-07T00:45:45.458Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 07, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-07T00:45:45.537Z: Starting 5 workers in us-central1-a...
    Aug 07, 2020 12:46:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-07T00:46:17.454Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 07, 2020 12:46:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-07T00:46:23.588Z: Autoscaling: Raised the number of workers to 3 based on the rate of progress in the currently running stage(s).
    Aug 07, 2020 12:46:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-07T00:46:23.625Z: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
    Aug 07, 2020 12:46:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-07T00:46:28.992Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 07, 2020 12:46:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-07T00:46:44.714Z: Workers have started successfully.
    Aug 07, 2020 12:46:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-07T00:46:44.756Z: Workers have started successfully.
    Aug 07, 2020 12:47:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-07T00:47:23.365Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 07, 2020 12:47:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-07T00:47:23.574Z: Cleaning up.
    Aug 07, 2020 12:47:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-07T00:47:23.664Z: Stopping worker pool...
    Aug 07, 2020 12:48:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-07T00:48:25.984Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 07, 2020 12:48:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-07T00:48:26.036Z: Worker pool stopped.
    Aug 07, 2020 12:48:33 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-06_17_45_36-11115950985384372719 finished with status DONE.
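
The two "BigQuery method is set to" values above are what separate the failing run from this passing one: DEFAULT goes through an export-based read, while DIRECT_READ uses the BigQuery Storage Read API, which is what allows the projected fields and the filter to be pushed down to BigQuery. Roughly the same choice can be sketched with the plain BigQueryIO API (illustrative only; the table name and field list below are placeholders, and the test itself configures this through the Beam SQL table provider rather than calling BigQueryIO directly):

    import com.google.api.services.bigquery.model.TableRow;
    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.values.PCollection;

    public class DirectReadExample {
      public static void main(String[] args) {
        // Needs --project (and GCP credentials) on the command line.
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        PCollection<TableRow> rows =
            p.apply(
                "Read with push-down",
                BigQueryIO.readTableRows()
                    .from("bigquery-public-data:hacker_news.full")  // placeholder table
                    .withMethod(Method.DIRECT_READ)
                    // Column projection and the row restriction are served by the Storage
                    // Read API, mirroring usedFields and the pushed-down filter in the plan.
                    .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                    .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }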

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 8dfb08af-c7e1-4560-97f3-857875e5a0de and timestamp: 2020-08-07T00:48:33.635000000Z:
                     Metric:                    Value:
                   read_time                    15.878
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 07, 2020 12:48:34 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.022 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.032 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 8,5,main]) completed. Took 3 mins 10.878 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 18s
106 actionable tasks: 73 executed, 33 from cache

Publishing build scan...
https://gradle.com/s/oahjr5aiddv2c

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #839

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/839/display/redirect?page=changes>

Changes:

[douglas.damon] Add Additional Parameters lesson to Go SDK Katas

[michal.walenia] [BEAM-9421] Add Java snippets to NLP documentation.

[douglas.damon] Edit lesson name and task description

[douglas.damon] Additional edits to task description

[noreply] [BEAM-10258] Support type hint annotations on PTransform's expand()

[noreply] [BEAM-10522] Added SnowflakeIO connector guide (#12296)

[douglas.damon] Update stepik course

[noreply] [BEAM-10240] Support ZetaSQL DATETIME functions in BeamSQL (#12348)


------------------------------------------
[...truncated 299.27 KB...]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 06, 2020 6:45:34 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 06, 2020 6:45:34 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 06, 2020 6:45:34 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 06, 2020 6:45:34 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 06, 2020 6:45:34 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 06, 2020 6:45:34 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 06, 2020 6:45:34 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 06, 2020 6:45:35 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 06, 2020 6:45:35 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 06, 2020 6:45:35 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 06, 2020 6:45:35 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 06, 2020 6:45:35 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 06, 2020 6:45:35 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 06, 2020 6:45:35 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 06, 2020 6:45:35 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 06, 2020 6:45:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 06, 2020 6:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 06, 2020 6:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-H2miNQpilgNu00Nsv4JrVoR8L3O-3731-f75jCfe29o.jar
    Aug 06, 2020 6:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests-sK9-moiNIU-UELIuMm9wvL3sFFWpSw0EelVQ9vtKCQk.jar
    Aug 06, 2020 6:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-Fi-9i9zgaB6b43nd2nFkxA-gXyj6TziVSbLOF7xPocM.jar
    Aug 06, 2020 6:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests-HtJnPRB_2zokcfSJJgVCYEq2WRLRArjjQGJxG0c-Wi4.jar
    Aug 06, 2020 6:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests-tUmSQPOXzx3vS_rj7L_6wX8J2TsmE99eDTP0XC6vcC0.jar
    Aug 06, 2020 6:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-uBfb4a93-ZEOg9mRx5PwR-hKQ1ZhBH8Y4ivAVriUhwA.jar
    Aug 06, 2020 6:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8266196972252162608.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-It4Nyg3oqH5QK7ajeg0jCyqy8Nz-Ilhnw7S37Ck0_Jw.jar
    Aug 06, 2020 6:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests-8QCeAF8ZVfr4q4mDkU8lVaAPxDi6lwSnR0eKzvtmsIE.jar
    Aug 06, 2020 6:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-iAdw4caY9EBQm7y17EZA2NZxmQuc-m2pUO5AIMJ89X4.jar
    Aug 06, 2020 6:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.24.0-SNAPSHOT-Zpv4rJn-xLEpdUwPxcmkH75ZZZTnAwKZpM4oooljoBI.jar
    Aug 06, 2020 6:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests-jiD61KLGNzx0nkw3t9keBoOKkjPe5YSltJT3wO1I6RM.jar
    Aug 06, 2020 6:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests-Pn_b4KJUj-AEXV_-yJuzCg4zxfsfnrOuJV49VVMd2eA.jar
    Aug 06, 2020 6:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-YAL3KVS5nSDprIHsWZsY3rs10gkn0-TIf8zgzRjM9Vo.jar
    Aug 06, 2020 6:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-wwdOcnCNgtVMfF8STBoXtYavT3obUkAISmBP-yvjqrY.jar
    Aug 06, 2020 6:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-9f_qpYIplSyosSc5fB-MtBlWnVhCCH5jwttCzQtG20Y.jar
    Aug 06, 2020 6:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT-x4TNtg0M_WRs9UvXV4Wp_1Ov45yHKd15X5BYwTfOq0A.jar
    Aug 06, 2020 6:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-HliuEfssLGQc6JfmZ6wh6ZsqWnyPlA-CVRNsjqeHTA8.jar
    Aug 06, 2020 6:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests-UtNBSObuD3lj62cA3bYgJkl9L7CU_Er3tw9xq8gfbo4.jar
    Aug 06, 2020 6:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT-u8BrwahGE8jq4T0aqjFYPAU1umXg1p87Hm-Mvl0Ytdo.jar
    Aug 06, 2020 6:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT-XZ-hIXDgCwFbpveHj6a8mkbft9Ec3Iy_9O4aO5CtPBk.jar
    Aug 06, 2020 6:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.24.0-SNAPSHOT-09RI_OwvPH9lShiiRYhCXu9JfJPePYK0QiBglwdmgT4.jar
    Aug 06, 2020 6:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-H2miNQpilgNu00Nsv4JrVoR8L3O-3731-f75jCfe29o.jar
    Aug 06, 2020 6:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.24.0-SNAPSHOT-eYMBh2wgiU9j1TBuHdoatZo-kcMK8azGVCL0GdSg3aA.jar
    Aug 06, 2020 6:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-zMlaEoOYUYdn3IWav-DGLpOpSa22TyKiwL2ZMSjbQXA.jar
    Aug 06, 2020 6:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-k1wdUFQvop4NmrTxmUAaO9Szy2qjr-q6x6SbEmAc2ok.jar
    Aug 06, 2020 6:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-tests-aynYWW5iZhhyWeJ6haM4pYlma8qJVt24JfqkABWIi5k.jar
    Aug 06, 2020 6:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-K1-_Eb4odZqm5jkjfr1ucYkX_REDHamp4QxJrwSkvpo.jar
    Aug 06, 2020 6:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded-BHjTOqUNgt-VhSQGVzig8RSUPZNVWrlBQkadcipjiFk.jar
    Aug 06, 2020 6:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.24.0-SNAPSHOT-hacbOC6OlBPwgi9wVNdDzHt6n2WzjiixBO0Xs8kDxgQ.jar
    Aug 06, 2020 6:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.24.0-SNAPSHOT-KbE7aJpqKBMqqQqz29nynMr6rLp-UUvUKHm9cMuXitI.jar
    Aug 06, 2020 6:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.24.0-SNAPSHOT-WuJ6F9K_2CPrLLZxEnGzqmMxm0oe1mR-xxUKnENK3MM.jar
    Aug 06, 2020 6:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 1 seconds
    Aug 06, 2020 6:45:39 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 06, 2020 6:45:39 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 06, 2020 6:45:40 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 06, 2020 6:45:40 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 06, 2020 6:45:40 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 06, 2020 6:45:40 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93834 bytes, hash 5a3704ca3a82a741fe6543ed7c068ba8632e69c9c675415fd3566da923a0d19d> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-WjcEyjqCp0H-ZUPtfAaLqGMuacnGdUFf01ZtqSOg0Z0.pb
    Aug 06, 2020 6:45:40 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.24.0-SNAPSHOT
    Aug 06, 2020 6:45:41 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-06_11_45_40-12987485498508924587?project=apache-beam-testing
    Aug 06, 2020 6:45:41 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-06_11_45_40-12987485498508924587
    Aug 06, 2020 6:45:41 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-06_11_45_40-12987485498508924587
    Aug 06, 2020 6:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-06T18:45:40.408Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 06, 2020 6:45:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T18:45:47.021Z: Worker configuration: n1-standard-1 in us-central1-a.
    Aug 06, 2020 6:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T18:45:47.745Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 06, 2020 6:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T18:45:47.790Z: Expanding GroupByKey operations into optimizable parts.
    Aug 06, 2020 6:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T18:45:47.825Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 06, 2020 6:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T18:45:47.899Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 06, 2020 6:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T18:45:47.940Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 06, 2020 6:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T18:45:47.966Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 06, 2020 6:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T18:45:48.000Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 06, 2020 6:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T18:45:48.486Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 06, 2020 6:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T18:45:48.616Z: Starting 5 workers in us-central1-a...
    Aug 06, 2020 6:46:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T18:46:16.815Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Aug 06, 2020 6:46:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T18:46:16.843Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Aug 06, 2020 6:46:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-06T18:46:19.573Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 06, 2020 6:46:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T18:46:22.160Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 06, 2020 6:46:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T18:46:37.287Z: Workers have started successfully.
    Aug 06, 2020 6:46:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T18:46:37.314Z: Workers have started successfully.
    Aug 06, 2020 6:47:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T18:47:11.323Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 06, 2020 6:47:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T18:47:11.510Z: Cleaning up.
    Aug 06, 2020 6:47:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T18:47:11.586Z: Stopping worker pool...
    Aug 06, 2020 6:48:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T18:48:12.299Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 06, 2020 6:48:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T18:48:12.336Z: Worker pool stopped.
    Aug 06, 2020 6:48:20 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-06_11_45_40-12987485498508924587 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 5a1be745-de4e-45fe-ae3d-11e8a941c223 and timestamp: 2020-08-06T18:48:20.177000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    15.209
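
For context on where numbers like fields_read above come from: Beam user metrics are declared inside a DoFn and read back from the PipelineResult after the run finishes. The sketch below is a generic, hypothetical example of that pattern, assuming a made-up namespace "perf" and reusing the metric name "fields_read"; it is not the monitor DoFns the integration test actually uses.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.PipelineResult;
    import org.apache.beam.sdk.metrics.Counter;
    import org.apache.beam.sdk.metrics.MetricNameFilter;
    import org.apache.beam.sdk.metrics.MetricQueryResults;
    import org.apache.beam.sdk.metrics.MetricResult;
    import org.apache.beam.sdk.metrics.Metrics;
    import org.apache.beam.sdk.metrics.MetricsFilter;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;

    public class FieldsReadMetricSketch {
      static class CountFieldsFn extends DoFn<String, String> {
        // Hypothetical counter, named after the metric reported in the results above.
        private final Counter fieldsRead = Metrics.counter("perf", "fields_read");

        @ProcessElement
        public void process(ProcessContext c) {
          fieldsRead.inc(); // one increment per element observed
          c.output(c.element());
        }
      }

      public static void main(String[] args) {
        Pipeline p = Pipeline.create();
        p.apply(Create.of("a", "b", "c")).apply(ParDo.of(new CountFieldsFn()));

        PipelineResult result = p.run();
        result.waitUntilFinish();

        // Query the counter back from the runner, the same way load-test results are collected.
        MetricQueryResults metrics = result.metrics().queryMetrics(
            MetricsFilter.builder()
                .addNameFilter(MetricNameFilter.named("perf", "fields_read"))
                .build());
        for (MetricResult<Long> counter : metrics.getCounters()) {
          System.out.println("fields_read (attempted) = " + counter.getAttempted());
        }
      }
    }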

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 06, 2020 6:48:20 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 3 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.018 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 2 mins 54.027 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 4s
106 actionable tasks: 77 executed, 29 from cache

Publishing build scan...
https://gradle.com/s/wktmphie2vsfo

Stopped 2 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #838

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/838/display/redirect>

Changes:


------------------------------------------
[...truncated 293.64 KB...]
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 06, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 06, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 06, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 06, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 06, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 06, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 06, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
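
The failure above is the usual missing-row-schema pattern: a ParDo that emits Beam Row values gives the SDK nothing to infer a coder from, and the exception itself points at PCollection.setRowSchema as the fix. Below is a minimal, self-contained sketch of that fix; the schema, field names, values and DoFn are illustrative assumptions, not the test's actual RowMonitor code.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      // Columns matching the projection in the query above; types are assumptions.
      static final Schema SCHEMA = Schema.builder()
          .addStringField("author")
          .addStringField("type")
          .addStringField("title")
          .addInt64Field("score")
          .build();

      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        PCollection<Row> rows = p
            .apply(Create.of("seed"))
            .apply(ParDo.of(new DoFn<String, Row>() {
              @ProcessElement
              public void process(ProcessContext c) {
                c.output(Row.withSchema(SCHEMA)
                    .addValues("someuser", "story", "A title", 3L)
                    .build());
              }
            }))
            // Without this call, PCollection.getCoder() fails with the same
            // IllegalStateException shown in the log above.
            .setRowSchema(SCHEMA);

        p.run().waitUntilFinish();
      }
    }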

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 06, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 06, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 06, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 06, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 06, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 06, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 06, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 06, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
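
The two log lines above are the point of the push-down test: with the BigQuery Storage API (DIRECT_READ), only the four used fields are requested and the filter is evaluated on the BigQuery side instead of in the pipeline. Outside of Beam SQL, an equivalent read can be expressed directly on BigQueryIO; the sketch below is illustrative only, and the table reference is an assumption rather than the table the test targets. Running it would additionally need GCP credentials, --project and --tempLocation.

    import com.google.api.services.bigquery.model.TableRow;
    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.values.PCollection;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(
            PipelineOptionsFactory.fromArgs(args).withValidation().create());

        PCollection<TableRow> rows = p.apply(
            BigQueryIO.readTableRows()
                .from("bigquery-public-data:hacker_news.full") // assumed table reference
                .withMethod(BigQueryIO.TypedRead.Method.DIRECT_READ)
                // Counterpart of usedFields in the plan above.
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                // Counterpart of the pushed-down filter logged above.
                .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }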
    Aug 06, 2020 12:45:13 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 06, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 06, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-fXGbEPHe5mhvr2M7hfJkpK4uZ5vvi0DRMGF7NeyCPh8.jar
    Aug 06, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-eVUenyOagvqIYKdox27P0T6bXb-O_yBx4rqFE8IraLY.jar
    Aug 06, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-KGBv3G_Q0_iSwILTjxzI_IBiuGWM_KLS0Egoq5N5EDo.jar
    Aug 06, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-QCT21JlpfbIMka-8lxDQOWVBSYyq3HBqtdeoWNJGFGE.jar
    Aug 06, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-Yc7XV2ptfGjV8g-S_qWJP4d0IrQKOBlkxunpr4xdQ40.jar
    Aug 06, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.24.0-SNAPSHOT-V8BOAkpwDShnEMod76MqyoaK6LtylhfyMKn_aAscShg.jar
    Aug 06, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-ijVbdvG-9B6ZTqdiBcELdQ9yohLieIg1zdRnykZPD-Q.jar
    Aug 06, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.24.0-SNAPSHOT-GtXoo_9LqVBkdWyGnyBEt3cdhSshcF60CVP1F0TJcRA.jar
    Aug 06, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests-7KumYkGJVD1DDWJ8UKNh6jy2Yezuhd7kapE3iO40D7s.jar
    Aug 06, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-tests-lAijZA1weojdFgNhxVZE-_u448ANC3-gmteIEtk8yCM.jar
    Aug 06, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-GIYZekNjfk3NMT__XsHkxcJA7nYYC3KEVxCrPt0aRyU.jar
    Aug 06, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests-a8prFTo-V9Dieu-USAsmb3uDUO4Z8c4tldM01JcwM_s.jar
    Aug 06, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT-YpfvwajQqqUU9FktnYLjwP36xQ3qHVMGBMJUONUwi4o.jar
    Aug 06, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests-MQyWzXDeooGtTa7g5Fqo0EJIo2Wyr-rvhwlK_sBAbyE.jar
    Aug 06, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT--wqBxjW3J0uZuAgX67CZ88fyl-ASHi7p0iT8TNk8Luc.jar
    Aug 06, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-kwZIvd-c_rlgZ0AmaRFKhW4uykQ0Yl5q5iYxXfH0_8Y.jar
    Aug 06, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2120604030499682407.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-Mr1nLS49NThYcT5GLmLyuNTP88oTRX4AGPodJf1j5c8.jar
    Aug 06, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests-HVPMOJMmobfrZnUvOHUV3j42SShVif45idV9ZLIqh7A.jar
    Aug 06, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-LyP_Ok4hmzt6X_SrZF-b6tkdMFMze9HUjG62F55ZgUg.jar
    Aug 06, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests-Wr2lO2bYnVbTVs5l5upHyveUrnTz8DdTAzBGavXkKYw.jar
    Aug 06, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests-1gnPmF4EuO14xvU6Iai4EYNnDoeo87q1zGLIB3hvUMg.jar
    Aug 06, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.24.0-SNAPSHOT-WCGVxGrb46KB7mxyLG0uSeJCWxDIJVQKn_gIcgjwi6o.jar
    Aug 06, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT-Tng5ViWunYMY8vAvzv-f-ZroKNW9gtKOXg2XhaVbKOY.jar
    Aug 06, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests-BbS3WwgIftlmVlCa-kurqqfR5jd9g8maQxvYdzny4EA.jar
    Aug 06, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-fXGbEPHe5mhvr2M7hfJkpK4uZ5vvi0DRMGF7NeyCPh8.jar
    Aug 06, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded-ax0DYirQqjYC1ysOCxz6jvI6NIVqqo_NEXS4i_BA_tk.jar
    Aug 06, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT-w951B-gXEd54mGvEWIEIZvMAEYb_OFJpi6VMBM9wVRQ.jar
    Aug 06, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-p61NgfMvI0r2usRSWS7oC_OeFnQbGYq72dwIg12n3xI.jar
    Aug 06, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.24.0-SNAPSHOT-SY2dPyVxq5meoPmuBog2f98Fu-ccOkzuXVJu_GtfHEw.jar
    Aug 06, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.24.0-SNAPSHOT-Pl_ArSzzWa_MsTLRSASLxHzYSaad3EiRbG2Uk0MWcxc.jar
    Aug 06, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.24.0-SNAPSHOT-CAoI1K9Ebl92s2XeLtYz91Hl-0bVhN847FXXscsSIPA.jar
    Aug 06, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 2 seconds
    Aug 06, 2020 12:45:17 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 06, 2020 12:45:17 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 06, 2020 12:45:17 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 06, 2020 12:45:17 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 06, 2020 12:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 06, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93834 bytes, hash 3818fda6110dc3cd4d55f5fd3e663823ac036137389e96b8b2cf4ec3966da216> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-OBj9phENw81NVfX9PmY4I6wDYTc4npa4ss9Ow5ZtohY.pb
    Aug 06, 2020 12:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.24.0-SNAPSHOT
    Aug 06, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-06_05_45_18-2438885846867638208?project=apache-beam-testing
    Aug 06, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-06_05_45_18-2438885846867638208
    Aug 06, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-06_05_45_18-2438885846867638208
    Aug 06, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-06T12:45:18.046Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 06, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T12:45:26.505Z: Worker configuration: n1-standard-1 in us-central1-a.
    Aug 06, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T12:45:27.219Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 06, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T12:45:27.287Z: Expanding GroupByKey operations into optimizable parts.
    Aug 06, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T12:45:27.327Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 06, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T12:45:27.394Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 06, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T12:45:27.424Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 06, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T12:45:27.459Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 06, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T12:45:27.489Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 06, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T12:45:27.928Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 06, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T12:45:28.006Z: Starting 5 workers in us-central1-a...
    Aug 06, 2020 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-06T12:45:48.460Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 06, 2020 12:46:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T12:46:01.753Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Aug 06, 2020 12:46:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T12:46:01.792Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Aug 06, 2020 12:46:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T12:46:07.125Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 06, 2020 12:46:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T12:46:23.136Z: Workers have started successfully.
    Aug 06, 2020 12:46:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T12:46:23.160Z: Workers have started successfully.
    Aug 06, 2020 12:46:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T12:46:51.602Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 06, 2020 12:46:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T12:46:51.813Z: Cleaning up.
    Aug 06, 2020 12:46:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T12:46:51.890Z: Stopping worker pool...
    Aug 06, 2020 12:47:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T12:47:40.080Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 06, 2020 12:47:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T12:47:40.131Z: Worker pool stopped.
    Aug 06, 2020 12:47:47 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-06_05_45_18-2438885846867638208 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): feef2067-6986-41ce-a956-00125cce8708 and timestamp: 2020-08-06T12:47:48.037000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    12.144

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 06, 2020 12:47:48 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 2 mins 44.046 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 32s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/blkjnscxziamg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #837

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/837/display/redirect?page=changes>

Changes:

[noreply] [BEAM-7996] Add support for MapType and Nulls in container types for


------------------------------------------
[...truncated 292.92 KB...]
    INFO: BigQuery method is set to: DEFAULT
    Aug 06, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 06, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 06, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 06, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 06, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 06, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])
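
As a point of reference, the plan above is what the planner produces when the read method does not support push-down: a plain BeamIOSourceRel with the filter and projection kept in BeamCalcRel. A query of this shape can be run with SqlTransform against any schema-aware PCollection registered under the HACKER_NEWS name; the sketch below is a minimal illustration, and the schema, values and tuple tag are assumptions rather than the test's real table setup.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.PCollectionTuple;
    import org.apache.beam.sdk.values.Row;
    import org.apache.beam.sdk.values.TupleTag;

    public class HackerNewsSqlSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        // Columns referenced by the query; types are assumptions for illustration.
        Schema schema = Schema.builder()
            .addStringField("by")
            .addStringField("type")
            .addStringField("title")
            .addInt64Field("score")
            .build();

        PCollection<Row> hackerNews = p.apply(
            Create.of(Row.withSchema(schema)
                    .addValues("someuser", "story", "A title", 3L).build())
                .withCoder(RowCoder.of(schema)));

        // The tuple tag becomes the table name visible to the SQL query.
        PCollection<Row> filtered = PCollectionTuple
            .of(new TupleTag<Row>("HACKER_NEWS"), hackerNews)
            .apply(SqlTransform.query(
                "SELECT `by` AS author, type, title, score "
                    + "FROM HACKER_NEWS "
                    + "WHERE (type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }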


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 06, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 06, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 06, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 06, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 06, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 06, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 06, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 06, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 06, 2020 6:45:13 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 06, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 06, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-ZOekYV_yznfh42sgPR6IfUqrwvwzyHQwzDlNCb0gOvI.jar
    Aug 06, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-Hy1Mrt3m9rDqHtNSyygKDcWhfniyWfJpzwvtT8f_f10.jar
    Aug 06, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-YM58RRsDUyDiFO3NXepnBblIlwvTDQEEyBtKrnr-YV8.jar
    Aug 06, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests-O39GiIhty_QQjcmcK0V_LpYe74duN4BtSuEGRTJSFPU.jar
    Aug 06, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-ou4dNXXmNacyVvC2eqDJOhBFAPZQztq2K8IZ60fQPuc.jar
    Aug 06, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests-3kPCeh5uEskv5pHX7THAjN2RynSP2kVL-bA6n2rbGW0.jar
    Aug 06, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests-8rWav1M5I_XOPRxsR73LP1u06mkVOb0K5sqfNRhEwEY.jar
    Aug 06, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests-6trePQun0vtz9edhgIb8uwDiSp1X-RO6TE7ZgnHFrLs.jar
    Aug 06, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-l87fI_oo6aAbzoD4OhVQVvd8wjcAy6ZLw-v52AFMXlc.jar
    Aug 06, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests-HooXQAy1l0NVeiijRsWctxed891-tTe-LS8IhItirQU.jar
    Aug 06, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests-Fs9x5pXoJgyMQmvtkipmOM1HPep7RxpinhJukFipu1U.jar
    Aug 06, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-MT1W7zCAwekJTGHQN0gx55BLXQD3I6BOyuPfq5H-x_M.jar
    Aug 06, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.24.0-SNAPSHOT-xrCJb59JM5uuMnquP8JZ0mTJ2hdUs2YV4mgEGq4t7Y4.jar
    Aug 06, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4627780284168432891.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-Ss92m5hk7uB5ZMZQUOpR0KasO7QdBhE0QlE58Hlui0s.jar
    Aug 06, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-lXXFKiqYLFK0cikhWtg_qbcmOd3xmSoSJ58AkoVFIEc.jar
    Aug 06, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.24.0-SNAPSHOT-xVGPoTqE7ux7-xAAqB5y0HsZIIQRgbdSC-3nEsE8gHw.jar
    Aug 06, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-H7Ff01gqJf5jemWQK3N63tYU3KBAbhtwm5lV_n68Z0U.jar
    Aug 06, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-e2trM00e1tIVzRrCK-WC3nuuZL8YD4yHhSzlpjNkSvU.jar
    Aug 06, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-p4Z68di8tijskZQLzD91EOrQ5M55c2qZOSFUiouK9kM.jar
    Aug 06, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-ZOekYV_yznfh42sgPR6IfUqrwvwzyHQwzDlNCb0gOvI.jar
    Aug 06, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests-wVKgpv8FkpLq1Dhej3M1QnZ8-DZJ2Zu3Li6yh3Ue_8I.jar
    Aug 06, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-Tn4iauoxS87pCTr-Ao6EOn1MutkD7-Ilf2WA5dSb41g.jar
    Aug 06, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT-zIQZRvuoBEnLCZUPADxjYoRQl1wmb3KFYSqVf3mJ9DQ.jar
    Aug 06, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-tests-I7ahkU9dFHs-in-bJYLCl7Mku00uGuP1ssOUhcX63qI.jar
    Aug 06, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded-CH_4xxznKxRsG3mNh4THZgvCy4AGZ7O9fpTUTxasku0.jar
    Aug 06, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.24.0-SNAPSHOT-jrsgnGvqMPgIc6tNhOjIDBoULyFdHX4LO1k5OrlOHR8.jar
    Aug 06, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT-F7t6Fw6FuCzh8URNnsZKS-DYmYRU2-ZRTjCfymEFaoU.jar
    Aug 06, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT-ilANHPIyOkuyuonGBudwY4P-veWB1DwvfOJlO6gi6k8.jar
    Aug 06, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.24.0-SNAPSHOT-Ee2Ejrlv5J2GbSIPKlt7ZRIwVDMF4IemXRC-cl5uSZU.jar
    Aug 06, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.24.0-SNAPSHOT-AxuRKX5xXFItfGHjxd7ZkHwSFl5P2P1-vg1TVilvmK0.jar
    Aug 06, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.24.0-SNAPSHOT-sVcVSS9uWvX7SrYSDC_kMis1snMiL-gelRy2slwWOMc.jar
    Aug 06, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 1 seconds
    Aug 06, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 06, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 06, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 06, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 06, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 06, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93834 bytes, hash 0d30e7d2b9790436fb4061773c629d9ddc3bd38e1aa86d151911228b751f4c1f> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-DTDn0rl5BDb7QGF3PGKdndw7044aqG0VGREii3UfTB8.pb
    Aug 06, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.24.0-SNAPSHOT
    Aug 06, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-05_23_45_17-11460344601115151783?project=apache-beam-testing
    Aug 06, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-05_23_45_17-11460344601115151783
    Aug 06, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-05_23_45_17-11460344601115151783
    Aug 06, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-06T06:45:17.060Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 06, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T06:45:25.579Z: Worker configuration: n1-standard-1 in us-central1-f.
    Aug 06, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T06:45:26.573Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 06, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T06:45:26.606Z: Expanding GroupByKey operations into optimizable parts.
    Aug 06, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T06:45:26.647Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 06, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T06:45:26.720Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 06, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T06:45:26.754Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 06, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T06:45:26.789Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 06, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T06:45:26.824Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 06, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T06:45:27.632Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 06, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T06:45:27.791Z: Starting 5 workers in us-central1-f...
    Aug 06, 2020 6:45:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-06T06:45:53.548Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 06, 2020 6:45:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T06:45:55.110Z: Autoscaling: Raised the number of workers to 2 based on the rate of progress in the currently running stage(s).
    Aug 06, 2020 6:45:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T06:45:55.138Z: Resized worker pool to 2, though goal was 5.  This could be a quota issue.
    Aug 06, 2020 6:46:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T06:46:00.535Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Aug 06, 2020 6:46:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T06:46:00.568Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Aug 06, 2020 6:46:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T06:46:19.908Z: Workers have started successfully.
    Aug 06, 2020 6:46:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T06:46:19.945Z: Workers have started successfully.
    Aug 06, 2020 6:46:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T06:46:48.771Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 06, 2020 6:46:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T06:46:56.679Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 06, 2020 6:46:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T06:46:56.968Z: Cleaning up.
    Aug 06, 2020 6:46:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T06:46:57.056Z: Stopping worker pool...
    Aug 06, 2020 6:48:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T06:48:00.113Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 06, 2020 6:48:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T06:48:00.165Z: Worker pool stopped.
    Aug 06, 2020 6:48:07 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-05_23_45_17-11460344601115151783 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 8cbae84f-a78e-4dd1-87e8-c9385f477393 and timestamp: 2020-08-06T06:48:07.721000000Z:
                     Metric:                    Value:
                   read_time                    15.923
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 06, 2020 6:48:08 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.019 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 3 mins 4.171 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 50s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/jm7wpt4vqyhgq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #836

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/836/display/redirect>

Changes:


------------------------------------------
[...truncated 292.74 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 06, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 06, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 06, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 06, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 06, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 06, 2020 12:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 06, 2020 12:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
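
The remedy named in this exception is to give the Row PCollection an explicit schema (or coder) before downstream transforms are applied. A minimal, self-contained sketch of that pattern follows; the field names mirror the four projected columns of the query above but are otherwise hypothetical, and this is not the integration test's actual code:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.schemas.Schema.FieldType;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Hypothetical schema mirroring the projected columns (author, type, title, score).
        final Schema schema =
            Schema.builder()
                .addNullableField("author", FieldType.STRING)
                .addNullableField("type", FieldType.STRING)
                .addNullableField("title", FieldType.STRING)
                .addNullableField("score", FieldType.INT64)
                .build();

        PCollection<Row> rows =
            p.apply(Create.of("alice,story,Some title,7", "bob,job,Hiring,3"))
                .apply(
                    ParDo.of(
                        new DoFn<String, Row>() {
                          @ProcessElement
                          public void processElement(@Element String line, OutputReceiver<Row> out) {
                            String[] f = line.split(",");
                            out.output(
                                Row.withSchema(schema)
                                    .addValues(f[0], f[1], f[2], Long.parseLong(f[3]))
                                    .build());
                          }
                        }))
                // Without an explicit schema, pipeline construction fails with the
                // "Cannot provide a coder for a Beam Row" error shown above.
                .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }

The exception's other suggestion, .setCoder(), comes down to the same thing for Rows, e.g. .setCoder(org.apache.beam.sdk.coders.RowCoder.of(schema)).
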

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 06, 2020 12:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 06, 2020 12:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 06, 2020 12:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 06, 2020 12:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 06, 2020 12:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 06, 2020 12:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 06, 2020 12:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 06, 2020 12:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
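
For context on the query being planned here: the same SELECT/WHERE shape can be exercised against any schema-aware PCollection of Rows via SqlTransform, where a single input PCollection is addressed as PCOLLECTION. This is only an illustrative sketch with made-up rows and a reduced schema, not the Hacker News table this test reads:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class SqlFilterSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        Schema schema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addInt64Field("score")
                .build();

        PCollection<Row> input =
            p.apply(
                Create.of(
                        Row.withSchema(schema).addValues("alice", "story", 7L).build(),
                        Row.withSchema(schema).addValues("bob", "comment", 9L).build())
                    .withCoder(RowCoder.of(schema)));

        // The same kind of filter the planner pushes down above, expressed directly in SQL.
        PCollection<Row> result =
            input.apply(
                SqlTransform.query(
                    "SELECT author, type, score FROM PCOLLECTION "
                        + "WHERE (type = 'story' OR type = 'job') AND score > 2"));

        // `result` could be checked with PAssert in a unit test or written to a sink.
        p.run().waitUntilFinish();
      }
    }
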
    Aug 06, 2020 12:45:15 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 06, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 06, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-EAVCbu9oO9ATsbmuXkh11DDy8xhjPi0IQvnsdrKsqNI.jar
    Aug 06, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests-Ady0PMCd4Fet1wOukjwXS0did8RM7kdDBjbJ1JPMT2I.jar
    Aug 06, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests-DutbySKhUp1dSqmiFwlb_bJn9qI2CondY-TjaYKEtow.jar
    Aug 06, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2082081859352873861.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-j3TwxVmqTxdv-snLQtbFo8w-2bk-mgrYcPWGgaOYW5A.jar
    Aug 06, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-xyRc3afJTppsLdBvxg72mSijuxcmt-2iDcrOvVkVOsk.jar
    Aug 06, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.24.0-SNAPSHOT-dsMRaAIWWjcSQgV2WI-Mpv22_3Uj6vvIiZ9mzY5UZh4.jar
    Aug 06, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-38hruTS5P0zc0m4IhPEFDG29VrE6vAqsMS9-GKslZTY.jar
    Aug 06, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests-Ljic23tuzlNShc50opWLLVk3mWsQG1ajzKLYXujgy-A.jar
    Aug 06, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests-rHxMTBqeziehdVUsvR4zXzli8GeNJsLLoiSA4nk0BKM.jar
    Aug 06, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-AugxDDqKmgronwPpaycp3V-o35p93dMi_KTC9s1dHlI.jar
    Aug 06, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.24.0-SNAPSHOT-iRs9ssM2AkKiOlqgJNJDtUa7qT6tqbwbE8iJQWG_P6w.jar
    Aug 06, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT-iu7VXw5435eTm3mP9JnHgwIV5xK5GAGnGM39vZtgmfo.jar
    Aug 06, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-kxjnfb57PTkYzjJvGMuXfWwG9TxBvM7rv4vC4rGrVSk.jar
    Aug 06, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-EAVCbu9oO9ATsbmuXkh11DDy8xhjPi0IQvnsdrKsqNI.jar
    Aug 06, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-8SPt0EwBqheznj8nZ5HDqR90ziSRIt95Ihyuq4Uzgzg.jar
    Aug 06, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-q0ZOuFZCyrKiUPk5_-xe6mC9xPJojJwsOc84uec7WOw.jar
    Aug 06, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.24.0-SNAPSHOT-P8kkI3oUBAMWXh0cY56YggciwkZKbsunLcqm8tSLMBI.jar
    Aug 06, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests-qQTmvi1n_JR5mLfupJFhx2WQwuUHC3gCfg70Q2rIZ3M.jar
    Aug 06, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT--rLo1q5Y81XUGk7RkKU9vA9nVGmmXZAe4GjFBQgrFP0.jar
    Aug 06, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests-NSqq3Mk_SnjrznlOYUsl1ZhowJ7_1SCQ1oyYmCYah8k.jar
    Aug 06, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-kQRjqKHQCzKSkWvd--qj6ZJWLX_ztwiDOCQU04idmms.jar
    Aug 06, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-myC8lH4E6PBu1r38B8t0hyzf49eDSXgiFgNTkSL-EgM.jar
    Aug 06, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-tests-96c2BlkX3yERj7Iypkm2C2dgXkTDiv_nKwEKHyqY_zA.jar
    Aug 06, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT-xC_2VRTzzFaty7c2H4TkkLAb2cqRk-xgGOqPbXrujWE.jar
    Aug 06, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded-TUFRDW5i44lZXtGz9xxbY1DQ0AxjPRZnFdmrGCR1b7s.jar
    Aug 06, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests-bqKWJwFvIELGrZQXgL-2xPXIS9q5dCoXp_pO_Tk59nc.jar
    Aug 06, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-GuUOXz5Z1y44Ajlp9Qsc5Y8rr8GCt9_pVM1UMKUV6EQ.jar
    Aug 06, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-fYTbvWLrPit_8_d7P9OwhAGflwrX3ep9Epn8aEPRp10.jar
    Aug 06, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.24.0-SNAPSHOT-xGeuBPCYmh2Asf0wrIFvm91wYiLlKV9NsvD6RH7xKac.jar
    Aug 06, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.24.0-SNAPSHOT-FmoA1WXeFRbdGLnmNnaailBHsLiTmY2b2UXueZRyiF8.jar
    Aug 06, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.24.0-SNAPSHOT-_OX1i0KpeNOrWYBjZ1RnaDaCieXOgvPKaFVzFzbfFHA.jar
    Aug 06, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 1 seconds
    Aug 06, 2020 12:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 06, 2020 12:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 06, 2020 12:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 06, 2020 12:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 06, 2020 12:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 06, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93834 bytes, hash e7a666e1b10c871510586ead8b0009ba60d2eb6a63e7acf1faefcd1976f8a865> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-56Zm4bEMhxUQWG6tiwAJumDS62pj56zx-u_NGXb4qGU.pb
    Aug 06, 2020 12:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.24.0-SNAPSHOT
    Aug 06, 2020 12:45:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-05_17_45_19-14372835429266464810?project=apache-beam-testing
    Aug 06, 2020 12:45:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-05_17_45_19-14372835429266464810
    Aug 06, 2020 12:45:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-05_17_45_19-14372835429266464810
    Aug 06, 2020 12:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-06T00:45:19.222Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 06, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T00:45:27.600Z: Worker configuration: n1-standard-1 in us-central1-a.
    Aug 06, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T00:45:28.591Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 06, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T00:45:28.629Z: Expanding GroupByKey operations into optimizable parts.
    Aug 06, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T00:45:28.668Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 06, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T00:45:28.742Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 06, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T00:45:28.788Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 06, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T00:45:28.824Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 06, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T00:45:28.860Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 06, 2020 12:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T00:45:29.296Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 06, 2020 12:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T00:45:29.384Z: Starting 5 workers in us-central1-a...
    Aug 06, 2020 12:45:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-06T00:45:37.955Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 06, 2020 12:45:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T00:45:54.225Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 06, 2020 12:46:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T00:46:14.728Z: Workers have started successfully.
    Aug 06, 2020 12:46:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T00:46:14.771Z: Workers have started successfully.
    Aug 06, 2020 12:46:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T00:46:50.106Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 06, 2020 12:46:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T00:46:50.337Z: Cleaning up.
    Aug 06, 2020 12:46:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T00:46:50.430Z: Stopping worker pool...
    Aug 06, 2020 12:47:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T00:47:32.161Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 06, 2020 12:47:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-06T00:47:32.201Z: Worker pool stopped.
    Aug 06, 2020 12:47:37 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-05_17_45_19-14372835429266464810 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 75f05728-7de6-40a9-a3b0-139cd4e76b8b and timestamp: 2020-08-06T00:47:37.722000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    13.526

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 06, 2020 12:47:38 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.036 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 7,5,main]) completed. Took 2 mins 32.701 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 22s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/hc5fzuzdtd7t2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #835

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/835/display/redirect?page=changes>

Changes:

[tysonjh] Update google-api-services* versions and google-client version to

[tysonjh] Set orderingKey field by method, not field name.

[Robert Burke] [BEAM-7996] Add map & nil encoding to Go SDK.

[noreply] Merge pull request #12149: [BEAM-9897] Add cross-language support to

[noreply] Merge pull request #12151: [BEAM-9896] Add streaming for

[Robert Bradshaw] Simplify common patterns for pandas methods.

[Robert Bradshaw] Use new infrastructure to simplify pandas implementation.


------------------------------------------
[...truncated 297.86 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 05, 2020 6:45:32 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 05, 2020 6:45:32 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 05, 2020 6:45:32 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 05, 2020 6:45:32 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 05, 2020 6:45:32 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 05, 2020 6:45:32 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 05, 2020 6:45:32 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 05, 2020 6:45:32 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 05, 2020 6:45:32 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 05, 2020 6:45:32 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 05, 2020 6:45:32 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 05, 2020 6:45:32 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 05, 2020 6:45:32 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 05, 2020 6:45:32 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 05, 2020 6:45:32 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
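
The projection (usedFields) and filter pushed down above are what the BigQuery Storage read ultimately receives. Outside of Beam SQL, a comparable projection and filter can be requested directly on BigQueryIO with the DIRECT_READ method; a rough sketch with a placeholder table reference (not the table this test reads):

    import java.util.Arrays;
    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.values.PCollection;

    public class DirectReadSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        PCollection<TableRow> rows =
            p.apply(
                BigQueryIO.readTableRows()
                    .from("my-project:my_dataset.hacker_news")  // placeholder table reference
                    .withMethod(Method.DIRECT_READ)             // Storage API, as in this test
                    // Column projection and row filter comparable to the pushed-down ones above.
                    .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                    .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }
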
    Aug 05, 2020 6:45:33 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 05, 2020 6:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 05, 2020 6:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-UJM03yJ8OvqQe94vCneWkF2BuZYtZQMyllTI_Bz4gro.jar
    Aug 05, 2020 6:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests-MbjKhpTgGDg9aHCaKgxHWooUh5940XaqdEWiOXW49a0.jar
    Aug 05, 2020 6:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-SSpaMfn5VOKh_cbBG3RXbVZph0J5rE9uh6qdOO4_C8s.jar
    Aug 05, 2020 6:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests-Jf9EGMF52RSmB32MdY6Q6FonXggwMWUWXM4kdylblqk.jar
    Aug 05, 2020 6:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests-B605mk4Y72VPxk0PsRcAJq045g_AfN3R0ledAg8qAg4.jar
    Aug 05, 2020 6:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-ia9EFfOkPNAowZCNT0XX5UEbPZujmtFFAh7UVAFyCNQ.jar
    Aug 05, 2020 6:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests-jfYtLOrU4xM0qUvFE45WBdwQHtJFhCVwtxBrlvUPzxo.jar
    Aug 05, 2020 6:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests-R4wkX2n77tDFXeoe_yn2_u-nsSMqfcHlCJXa_vUJFnk.jar
    Aug 05, 2020 6:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT-1sPOFy1yuPxnK3H7jrspY0B3rVa6-3RA3-kkKU57QDI.jar
    Aug 05, 2020 6:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests-nSl2kYorxxDb4505g5AAzu9U4VvDxEvgRsg2Uu__zcQ.jar
    Aug 05, 2020 6:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.24.0-SNAPSHOT-WLv2urjkrRVx3inyN04Qh241YqMh_VFHCBjOSWMWVEQ.jar
    Aug 05, 2020 6:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-tests-ZddZ0Gob2_Y3W4k8bEMhh8vbk2uyKkJekylaU8I55-s.jar
    Aug 05, 2020 6:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3288844064423441124.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-Jf1rTbQQlo0i6GGeJpYJeV8F8SEQBKUusbdCEmmoD5w.jar
    Aug 05, 2020 6:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests-klna0fetXtMmN2voZRosf5Y0WasONOFxcKy2jGrv0Z8.jar
    Aug 05, 2020 6:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-UJM03yJ8OvqQe94vCneWkF2BuZYtZQMyllTI_Bz4gro.jar
    Aug 05, 2020 6:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-A7luG4ZGNKflyO_LM2YVhMs_6rjOBkUPFQ_gRi4u8go.jar
    Aug 05, 2020 6:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT-idZ3mQGaUI0F9spGWNTLVKYLf7nGzPxLfnKJCvaLers.jar
    Aug 05, 2020 6:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-lO-vCgeRFMP7x7g2XtXG0-p0sCKHxqrl_KIBcOYOKcU.jar
    Aug 05, 2020 6:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-82k9oWo4LcWNtVlNIvldvRNVg575uqMpQqV38HmOhV8.jar
    Aug 05, 2020 6:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded-u9rpR7aufs0t4LAulDynzE1HT7wwzH0KQyoLJ6jW5qY.jar
    Aug 05, 2020 6:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.24.0-SNAPSHOT-a9kxv2tvJKSUfzfvjYG1hnCBfrm0wZ_6K2HknvGjUYM.jar
    Aug 05, 2020 6:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-AziyFpFPwvR4d5cWQLOBlih8JJTVwtIc3r44XjR8Vwc.jar
    Aug 05, 2020 6:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT-HRCYqNC_g5J5-rksiYCdPyMeZ_qeh-rLodweaKaArOs.jar
    Aug 05, 2020 6:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-_xt59Gy4aWdTIDswC9e189p_c6VKA8XdR_9kwZXjx2w.jar
    Aug 05, 2020 6:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-mPAtV8JwPa_fYNa2sVLHhP1JU4FTjDCfsZ_Hzt2ztaE.jar
    Aug 05, 2020 6:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.24.0-SNAPSHOT-LIF-IWLv1lcqUG_MlvhWUj96SuVAzrfrWIE205ZdFas.jar
    Aug 05, 2020 6:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-NOLjZNKYiHqWyS97n05tymO9x6NxgehGgg27e7MGG5g.jar
    Aug 05, 2020 6:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-cx0AcAVbcv5Venw12pvYyu1QIqhZRm4pAYxRjcpU4hw.jar
    Aug 05, 2020 6:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.24.0-SNAPSHOT-wbOWkgiqtPqBoqU2IY_1kuqDDNFbyvCioEs7Kb-_ybE.jar
    Aug 05, 2020 6:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.24.0-SNAPSHOT-_uWifFOrL9w1BhcEC43SOpBU2mBnaJ6qHF8emRpSzFg.jar
    Aug 05, 2020 6:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.24.0-SNAPSHOT-nv6URdalxLQMzGDk7um6inJNpiPrg-Q7dZL7IgUKTPs.jar
    Aug 05, 2020 6:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 1 seconds
    Aug 05, 2020 6:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 05, 2020 6:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 05, 2020 6:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 05, 2020 6:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 05, 2020 6:45:37 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 05, 2020 6:45:37 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93834 bytes, hash 1145995f0d3c54dc8c530f20b37f6f8ad24cbc0e8df2fde2c997f351037ffa79> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-EUWZXw08VNyMUw8gs39vitJMvA6N8v3iyZfzUQN_-nk.pb
    Aug 05, 2020 6:45:37 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.24.0-SNAPSHOT
    Aug 05, 2020 6:45:38 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-05_11_45_37-17601796880294578908?project=apache-beam-testing
    Aug 05, 2020 6:45:38 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-05_11_45_37-17601796880294578908
    Aug 05, 2020 6:45:38 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-05_11_45_37-17601796880294578908
    Aug 05, 2020 6:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-05T18:45:37.469Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 05, 2020 6:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-05T18:45:45.382Z: Worker configuration: n1-standard-1 in us-central1-a.
    Aug 05, 2020 6:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-05T18:45:46.300Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 05, 2020 6:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-05T18:45:46.338Z: Expanding GroupByKey operations into optimizable parts.
    Aug 05, 2020 6:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-05T18:45:46.364Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 05, 2020 6:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-05T18:45:46.514Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 05, 2020 6:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-05T18:45:46.567Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 05, 2020 6:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-05T18:45:46.606Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 05, 2020 6:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-05T18:45:46.650Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 05, 2020 6:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-05T18:45:47.133Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 05, 2020 6:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-05T18:45:47.212Z: Starting 5 workers in us-central1-a...
    Aug 05, 2020 6:46:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-05T18:46:13.261Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 05, 2020 6:46:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-05T18:46:37.707Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 05, 2020 6:46:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-05T18:46:58.035Z: Workers have started successfully.
    Aug 05, 2020 6:46:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-05T18:46:58.059Z: Workers have started successfully.
    Aug 05, 2020 6:47:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-05T18:47:31.039Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 05, 2020 6:47:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-05T18:47:31.221Z: Cleaning up.
    Aug 05, 2020 6:47:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-05T18:47:31.299Z: Stopping worker pool...
    Aug 05, 2020 6:48:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-05T18:48:25.206Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 05, 2020 6:48:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-05T18:48:25.251Z: Worker pool stopped.
    Aug 05, 2020 6:48:31 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-05_11_45_37-17601796880294578908 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 6d5486c2-e425-4b80-ad4a-a4a357f668fb and timestamp: 2020-08-05T18:48:31.466000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    15.011

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 05, 2020 6:48:31 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
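
A side note on this warning: the publisher skips metrics whenever the measurement and database settings are absent. The sketch below shows one plausible way such values get supplied as pipeline options; the option names and values are assumptions drawn from Beam's performance-test conventions, not taken from this job's configuration, and may not match what InfluxDBPublisher actually reads.

    // Assumed option names and values -- illustrative only, not verified
    // against the Jenkins job definition or InfluxDBPublisher itself.
    String[] influxOptions = {
        "--influxMeasurement=sql_bqio_read_java_batch",
        "--influxDatabase=beam_test_metrics",
        "--influxHost=http://localhost:8086"
    };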

Gradle Test Executor 3 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':',5,main]) completed. Took 3 mins 7.452 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 15s
106 actionable tasks: 76 executed, 30 from cache

Publishing build scan...
https://gradle.com/s/gpix4kyz362fu

Stopped 2 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #834

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/834/display/redirect?page=changes>

Changes:

[Robert Bradshaw] Move wordcount with metrics to its own file.

[Robert Bradshaw] Cleanup WordCount example.


------------------------------------------
[...truncated 295.00 KB...]
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 05, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 05, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 05, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 05, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 05, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 05, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 05, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
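
For anyone tracking down the same IllegalStateException: the failure is what the message describes, a PCollection of Beam Rows produced by the RowMonitor step with no coder and no schema attached. A minimal sketch of the fix the message suggests is below; `input`, `monitorFn`, and the schema fields are illustrative placeholders, not names taken from BigQueryIOPushDownIT.

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    // Placeholders: `input` is an upstream PCollection and `monitorFn` is a
    // DoFn that emits Beam Rows; neither name comes from the test code.
    Schema schema =
        Schema.builder()
            .addStringField("author")
            .addStringField("type")
            .addStringField("title")
            .addInt64Field("score")
            .build();

    PCollection<Row> rows =
        input
            .apply("RowMonitor", ParDo.of(monitorFn))
            // Attach the schema so a RowCoder can be inferred for the output;
            // an explicit alternative is .setCoder(RowCoder.of(schema)).
            .setRowSchema(schema);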

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 05, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 05, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 05, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 05, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 05, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 05, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 05, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 05, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 05, 2020 12:45:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 05, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 05, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-KJZ9cWrHcSeg8KLbgAUa6VWtBB37XHqYKZMAACfZov8.jar
    Aug 05, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-KJZ9cWrHcSeg8KLbgAUa6VWtBB37XHqYKZMAACfZov8.jar
    Aug 05, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests-sWgp14PYhXZzSd57YU3CLDH5-Lx3iVPXa_MUj98vzs4.jar
    Aug 05, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.24.0-SNAPSHOT-icQu3SO9m-8ycmrkp0fhN0cK8JNqWdyxwMo4pjhO3js.jar
    Aug 05, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-aDIihbpSOBhWHUEEeCq31WEDZhSHjpG890Rt3Ik5exI.jar
    Aug 05, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests-mIxcL9ck0dXY-nxKQGF_11wgtsBpCiMm3T91HrDbLFo.jar
    Aug 05, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-ZhuP-9M0_LkxghAH_KAbGDTLdDk_GbhImitqxW7R7BU.jar
    Aug 05, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-GJT4N5CNT8yC4aEws7-UZQ7A9AqYsof-Oye3x-HN0_k.jar
    Aug 05, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests-lheYBW7Q87LRtED_Q_rA62IvzWDUg3PRA9moH6gzoZs.jar
    Aug 05, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT-FG3xz2rZOacGF6KX2Q7vSA7fpImaM7C0VlmdHvZgO40.jar
    Aug 05, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT-NnKkXoZcPxrICgD-dO96osLOuitrfpS8NTdfM7brq5Y.jar
    Aug 05, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-hmq5JzLvLHctm2jmlnhLgZIvGxDnDBp3LRkKhwg6SAA.jar
    Aug 05, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-bP9uREFvCn7iguwat31coWIqiIF1YVSoHvLVEhOZ_hw.jar
    Aug 05, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests-rb5YhC55jwnCmWKl6UBv5tLHHFyAp_LsDQNyJZ4q5Ig.jar
    Aug 05, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.24.0-SNAPSHOT-hTg5PKU4zN5l-hiPDuH0QGo0heFb2aIXDh4z4kgctxU.jar
    Aug 05, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests-NL3cqegQdFqJCFpDPokM95rzLdyS-NBbrF16963Ebeg.jar
    Aug 05, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-tests-m0m6D8_h06H-Z60zmuwITrWoJG4gXrUmynAa_SiYtYA.jar
    Aug 05, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.24.0-SNAPSHOT-IsPPgZg95OvIwcswVWYNIgDZ5oUh2tYjoDlVMGjPFtw.jar
    Aug 05, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-21yWVkaRvnr9aOVZbtQil2Y_TpPQqLz9R2n9nyncm4I.jar
    Aug 05, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT--IGP3BupsS2tRt3qNsTEZ7HveEnXBnx4TETKx83ZmrA.jar
    Aug 05, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2593267279065927002.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-0mbd8qB-46sm4sE4MyuueesajGrDs2cWOKwf4W-qepo.jar
    Aug 05, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-JM1asOlYzI6IBMyCyBBMbtyvpd4RG9fr-xRuL5r5coI.jar
    Aug 05, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests-n5MENvM_QKbpe0LBmVkgzpDeq0U_oClP5oCcq5XRwhs.jar
    Aug 05, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded-FNeiYiBk-iLXlcDbQFH75q0AVSNGjEvaTduznZswdxo.jar
    Aug 05, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests-j9oAZh_hc3Uea0vc9AaZP0088L655eB21NQ8r9dtElo.jar
    Aug 05, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-5RDUWU5-DjJwZLUocrxZg1fG5EWF0868crEfubd_wcc.jar
    Aug 05, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-yPZ2HiGNKdKKJsCqt5pf9PCPnxBZWYYlFAUtPCqDO0s.jar
    Aug 05, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT-kJOCrlDL2j9j3R-jWZSDxNiqvCec8iF9omDSXX_ispk.jar
    Aug 05, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.24.0-SNAPSHOT-auaHhoLrMS8WDmtAx8UNk7tgqZ8GaVS7zsYhGO0tTP0.jar
    Aug 05, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.24.0-SNAPSHOT-mC00z5RCg2_yMGtchViYPQzW5QvBhqI_XUhrBDDwFpI.jar
    Aug 05, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.24.0-SNAPSHOT-JNcCcqMxh3hNndl4NhcFFm4wLMZF6j82thoXtsv4u3g.jar
    Aug 05, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 1 seconds
    Aug 05, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 05, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 05, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 05, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 05, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 05, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93804 bytes, hash 2c665c74f5ad7d907fe2e016c8f85904715d50e332bd1f2ef5c48b2a9cb1eac8> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-LGZcdPWtfZB_4uAWyPhZBHFdUOMyvR8u9cSLKpyx6sg.pb
    Aug 05, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.24.0-SNAPSHOT
    Aug 05, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-05_05_45_19-62491111900391564?project=apache-beam-testing
    Aug 05, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-05_05_45_19-62491111900391564
    Aug 05, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-05_05_45_19-62491111900391564
    Aug 05, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-05T12:45:19.997Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 05, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-05T12:45:28.605Z: Worker configuration: n1-standard-1 in us-central1-f.
    Aug 05, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-05T12:45:29.271Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 05, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-05T12:45:29.362Z: Expanding GroupByKey operations into optimizable parts.
    Aug 05, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-05T12:45:29.392Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 05, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-05T12:45:29.478Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 05, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-05T12:45:29.520Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 05, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-05T12:45:29.545Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 05, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-05T12:45:29.579Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 05, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-05T12:45:30.409Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 05, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-05T12:45:30.509Z: Starting 5 workers in us-central1-f...
    Aug 05, 2020 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-05T12:45:52.521Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 05, 2020 12:46:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-05T12:45:59.531Z: Autoscaling: Raised the number of workers to 2 based on the rate of progress in the currently running stage(s).
    Aug 05, 2020 12:46:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-05T12:45:59.598Z: Resized worker pool to 2, though goal was 5.  This could be a quota issue.
    Aug 05, 2020 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-05T12:46:05.069Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 05, 2020 12:46:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-05T12:46:31.455Z: Workers have started successfully.
    Aug 05, 2020 12:46:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-05T12:46:31.498Z: Workers have started successfully.
    Aug 05, 2020 12:47:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-05T12:47:04.647Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 05, 2020 12:47:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-05T12:47:04.880Z: Cleaning up.
    Aug 05, 2020 12:47:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-05T12:47:04.987Z: Stopping worker pool...
    Aug 05, 2020 12:47:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-05T12:47:54.097Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 05, 2020 12:47:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-05T12:47:54.151Z: Worker pool stopped.
    Aug 05, 2020 12:48:00 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-05_05_45_19-62491111900391564 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): f3bdf630-77e4-46b6-a7a8-92140a4ac35c and timestamp: 2020-08-05T12:48:00.865000000Z:
                     Metric:                    Value:
                   read_time                    12.169
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 05, 2020 12:48:01 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.025 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.032 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker,5,main]) completed. Took 2 mins 54.67 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 44s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/3pdsrnjl5wi4k

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #833

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/833/display/redirect?page=changes>

Changes:

[noreply] Migrate shared tag from tfx-bsl (#12468)


------------------------------------------
[...truncated 293.27 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 05, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 05, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 05, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 05, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 05, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 05, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 05, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 05, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 05, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 05, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 05, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 05, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 05, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 05, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 05, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 05, 2020 6:45:15 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 05, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 05, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-CIwnqDQcwl5DiLgGRMsrcs7tZlNJi-8fPk-XVce2CuI.jar
    Aug 05, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT-yC-oKpF9a6ybLPcQb_El-XGrsoQMtpISBvSuLej6ls0.jar
    Aug 05, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded--ViMOZIqTFaUM5kaF6_liH6nRiFXnPRruSqA3ljfyo0.jar
    Aug 05, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-B2FTrXRM7R6-ySfZNK2nvDIt2X5vzyFUorGTcr73XxQ.jar
    Aug 05, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-zfIf5DUJWCnggGlkO4jJCVQmpe8OHkGqyi5hKL3qlLo.jar
    Aug 05, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.24.0-SNAPSHOT-jv1XN4ZJVyBfVD2ZGUAlEsaxQeCmtASv7Tp3sIcEC3I.jar
    Aug 05, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests-BRCJ3XcJWHuFDjnxbQBKa2icic8OKZaYIHYC0Fct03E.jar
    Aug 05, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT-tQ4JEX442wusHIUJcyeelM8NVIene90mOpguyQ-peb0.jar
    Aug 05, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.24.0-SNAPSHOT-fqZ2SPV6S2NHdSVrhbOGAzEolJCH2nNdEdw8GblAU0c.jar
    Aug 05, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-r3SXqCuXtHXGFuOKwthGSYiuVZlN22OzPoKtWoXAV4Y.jar
    Aug 05, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests-3qUECf_xlQkXKpyrVeCPezK60AS80lB3j41sp2wsbCU.jar
    Aug 05, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-iq6IIBKI223apOvWlsKAOy6eROqnc2VZwNMHGRmo7vo.jar
    Aug 05, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-MOS4Vf-I8fzoQFwhCmXblMQjpwE4pbKYcjIj7P-LIDM.jar
    Aug 05, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-mPy4Wfkgb-iTvoV_k8yxiTjdbt3PVj2IxNEI5jUeVkI.jar
    Aug 05, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-CIwnqDQcwl5DiLgGRMsrcs7tZlNJi-8fPk-XVce2CuI.jar
    Aug 05, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test97869151762790257.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-tZ_tjh7_Z8R5mnrLkO4DPVqBstoeACZ_r5rN1tUwlyg.jar
    Aug 05, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests-YIZMQ29DAyinbUDhjCHtdT4hQlurKrZKhc68uDfSKag.jar
    Aug 05, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests-uJqjfmW1r4jDayRg14XYWABvCXIdwbXblHUR-5xHhns.jar
    Aug 05, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests-bIW5P8ZxGxcAqmXllfrKHh5aXv9HJxYqIXBVfRec330.jar
    Aug 05, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT-pr7oS1eeePEuoyW0nI4TV3O-v8sklTrLmxiEQvGVKDg.jar
    Aug 05, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-R4MPatX0N17nVScniQBsBgFF33ATlA96wP8PNWLZbX0.jar
    Aug 05, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests-WlQ9Altc8frkq44c7r1uSrYTm3ddPac3fbwBAQQk1Ag.jar
    Aug 05, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests-_hhKQSFkxKFxfJD6E4ue_lW1FadE_JldQrzhMbCzkvI.jar
    Aug 05, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-wrftsutiYthuz28EJg7FxmXuxuQcevGoG9_C90gXlsw.jar
    Aug 05, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-lkv7ku3E45xQgYVpCczKsZLZbqfgKOynJnrTHhtGGr4.jar
    Aug 05, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-tests-w2CcrFYdAMiKn5Dc_5VPnusR74XgI5Wz_53PBE19fuc.jar
    Aug 05, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-6HGN_yb8OFUpamibVQe9ZeCBYQN0cUxs8bMdbLeZ5_k.jar
    Aug 05, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.24.0-SNAPSHOT-XErDqPhZR2DyGhgqj-u4mmR8WHzDu7pxJjRC7HExue0.jar
    Aug 05, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.24.0-SNAPSHOT-88ONtujECXP1LA1coMQrkc0y12XpX3QXSTsCFAc7dh4.jar
    Aug 05, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.24.0-SNAPSHOT-UXdel0twES74cZDbYttO7qVH_HWFkWnGkLEfluYcI_o.jar
    Aug 05, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.24.0-SNAPSHOT-WIGgXpn-CAlofRCFR4aCQNyj_7e_EgdzEVq8y_Gutgo.jar
    Aug 05, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 1 seconds
    Aug 05, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 05, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 05, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 05, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 05, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 05, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93802 bytes, hash 8f0381a12dc0fac738cab99a3a90dfabc69f9d1efeefa1fd00474b889fb0efb2> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-jwOBoS3A-sc4yrmaOpDfq8afnR7-76H9AEdLiJ-w77I.pb
    Aug 05, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.24.0-SNAPSHOT
    Aug 05, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-04_23_45_18-6719401330595018567?project=apache-beam-testing
    Aug 05, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-04_23_45_18-6719401330595018567
    Aug 05, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-04_23_45_18-6719401330595018567
    Aug 05, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-05T06:45:18.820Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 05, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-05T06:45:27.622Z: Worker configuration: n1-standard-1 in us-central1-a.
    Aug 05, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-05T06:45:28.368Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 05, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-05T06:45:28.414Z: Expanding GroupByKey operations into optimizable parts.
    Aug 05, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-05T06:45:28.449Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 05, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-05T06:45:28.525Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 05, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-05T06:45:28.578Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 05, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-05T06:45:28.604Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 05, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-05T06:45:28.635Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 05, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-05T06:45:29.001Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 05, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-05T06:45:29.084Z: Starting 5 workers in us-central1-a...
    Aug 05, 2020 6:45:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-05T06:45:56.368Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 05, 2020 6:46:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-05T06:46:00.213Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 05, 2020 6:46:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-05T06:46:14.569Z: Workers have started successfully.
    Aug 05, 2020 6:46:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-05T06:46:14.629Z: Workers have started successfully.
    Aug 05, 2020 6:46:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-05T06:46:48.935Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 05, 2020 6:46:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-05T06:46:49.142Z: Cleaning up.
    Aug 05, 2020 6:46:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-05T06:46:49.245Z: Stopping worker pool...
    Aug 05, 2020 6:47:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-05T06:47:38.626Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 05, 2020 6:47:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-05T06:47:38.671Z: Worker pool stopped.
    Aug 05, 2020 6:47:46 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-04_23_45_18-6719401330595018567 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): ab61728e-b116-453a-b304-7ff34dca47f5 and timestamp: 2020-08-05T06:47:46.532000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    13.333

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 05, 2020 6:47:46 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.031 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 2 mins 42.023 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 30s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/xrig2pakxfxf4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #832

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/832/display/redirect?page=changes>

Changes:

[jx643] fixing the IOException handling

[jiadaixia] change getErrorInfo into separate method

[jiadaixia] delete unused import

[jiadaixia] delete empty line in import

[jiadaixia] spotless change

[jiadaixia] naming change

[jiadaixia] change to vendored guava class

[jiadaixia] add unit test

[jiadaixia] remove redundant code

[jiadaixia] further simplifies getErrorInfo

[jiadaixia] spotless fix

[jiadaixia] update

[jiadaixia] spotless

[jiadaixia] add comment

[jiadaixia] change test names

[jiadaixia] spotless

[jiadaixia] style change

[jiadaixia] add failure

[srohde] Add ElementLimiters which allows the cache to prematurely based on read

[jiadaixia] change exception handling

[jiadaixia] add jira issue

[srohde] Remove start_secs parameter.

[daniel.o.programmer] [BEAM-10289] Go Dynamic splitting full implementation.

[daniel.o.programmer] [BEAM-10289] Avoiding blocking when dynamic splitting in Go.

[Andrew Pilloud] [BEAM-10470] Handle null state from waitUntilFinish

[noreply] Merge pull request #12203 from [BEAM-6928] Make Python SDK custom Sink

[ningk] [BEAM-10635] Fix forward the google-api-core version

[noreply] [BEAM-10618] subprocess_server.py: Fallback to AF_INET6 family when

[noreply] [BEAM-10629] Added KnownBuilderInstances to ExternalTransformRegistrar

[noreply] [BEAM-10637] fix: test stream service start/stop (#12464)


------------------------------------------
[...truncated 297.54 KB...]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 05, 2020 12:45:36 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 05, 2020 12:45:36 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 05, 2020 12:45:37 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 05, 2020 12:45:37 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 05, 2020 12:45:37 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 05, 2020 12:45:37 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 05, 2020 12:45:37 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
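
The IllegalStateException above is the usual "Row PCollection without a schema" failure: no RowCoder can be inferred until a schema is attached. A minimal, hedged sketch of the fix the message itself points at (the schema, field names, RowMonitorFn, and input below are illustrative assumptions, not the IT's actual code):

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    // Illustrative schema matching the columns projected by the query above.
    Schema schema =
        Schema.builder()
            .addStringField("author")
            .addStringField("type")
            .addStringField("title")
            .addInt64Field("score")
            .build();

    // RowMonitorFn is a hypothetical DoFn<Row, Row> standing in for the test's
    // monitoring transform, and "input" is an assumed upstream PCollection<Row>.
    // Attaching the schema lets Beam build a RowCoder for the output, which is
    // what the error message asks for via PCollection.setRowSchema.
    PCollection<Row> monitored =
        input
            .apply("RowMonitor", ParDo.of(new RowMonitorFn()))
            .setRowSchema(schema); // roughly equivalent to .setCoder(RowCoder.of(schema))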

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 05, 2020 12:45:37 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 05, 2020 12:45:37 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 05, 2020 12:45:37 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 05, 2020 12:45:37 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 05, 2020 12:45:37 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 05, 2020 12:45:37 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 05, 2020 12:45:37 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 05, 2020 12:45:37 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
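
The BeamPushDownIOSourceRel and the pushed-down filter logged above correspond to a BigQuery Storage Read API read that requests only the used fields and sends the supported predicate as a row restriction. A hedged sketch of roughly the same read expressed directly with BigQueryIO (the project/dataset/table reference and the "pipeline" variable are placeholders, not the resources used by this job):

    import java.util.Arrays;
    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;
    import org.apache.beam.sdk.values.PCollection;

    // Placeholder table reference; "pipeline" is an already-constructed Pipeline.
    PCollection<TableRow> rows =
        pipeline.apply(
            "Read BQ rows with push-down",
            BigQueryIO.readTableRows()
                .from("some-project:some_dataset.HACKER_NEWS")
                .withMethod(TypedRead.Method.DIRECT_READ)   // BigQuery Storage Read API
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                .withRowRestriction(
                    "(`type` = 'story' OR `type` = 'job') AND `score` > 2"));
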
    Aug 05, 2020 12:45:38 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 05, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 05, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-_EnGRt3pXCH7ilfF-OImqEzFJrjNbgEICwpfrMfJ-4s.jar
    Aug 05, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests-xa6fLZBofQ-yRarIiEbOY_I2EkNou4vtfewlUsAqMGU.jar
    Aug 05, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-_EnGRt3pXCH7ilfF-OImqEzFJrjNbgEICwpfrMfJ-4s.jar
    Aug 05, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests-MdJnx2utixNsPmrYSXjJOWxELNI5_JVYUCUBdcjOqPY.jar
    Aug 05, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-Aqcwxwu5M2OFl0mSbIEqFZZN7rsVlTy29OPPEKZzU20.jar
    Aug 05, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-jIGfYoRBNz1WF-sfRsLfxJZPX4sKmOX_TeGVP5v1cok.jar
    Aug 05, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.24.0-SNAPSHOT-3lXGTaQO-3TmIukff_y_qWZN8EGN8FIp04qM-tl9QZI.jar
    Aug 05, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests-jYJWmBAIROEDvHHHzCN8qA0eSU7Q4sAf585POy6KAzI.jar
    Aug 05, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-D1Cs5nyidkiCYj3LnR4sgUAyWa1XuOxnUHC_njmzWtQ.jar
    Aug 05, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT-09Buzvkr8Fbz8Z6_0wST3dhXzzAMJcN-oHymU9grRkI.jar
    Aug 05, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tV40fcDgyHbIBdfRfWGTE3EWwOf7KfWA5l5Ffx6o2pw.jar
    Aug 05, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-2g_Dez7HXz5k3rYHVt9Dkf2gl-Q1ITmYOcBpSfR7hVg.jar
    Aug 05, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT-8e_wFdkCd7lGVFKLIUjNBVpWsq8MKNj4fATMvFgvDj8.jar
    Aug 05, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests-D4Eej83wtfkRWq_HgTb7O-AHF5-AGdZ59YWVTKTt7cM.jar
    Aug 05, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4958846721888975974.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-qSgv2WUQkWt8frP-Up2hKQXBEYzboc1nitlnTWvVrKU.jar
    Aug 05, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-I8r0btRJE_WAptAjpliRLrcOkFV-TtQxRRpTz30bnzk.jar
    Aug 05, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-uTblnExOLCpL22BE55ObXqUGfG3qy35_PYMI8e-oTwk.jar
    Aug 05, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT-Yglm5SbTaRf_3uYxQeDhK-dqvE94hlrKnQird_KRZmg.jar
    Aug 05, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.24.0-SNAPSHOT-I4B_i10xj2hpJzWx1-fE6mhjVDuFBvV15vvouvhyTQg.jar
    Aug 05, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-4c53ofzEVZZpAznnSWJNGKkRsfrlO-VM70j0WajOsJw.jar
    Aug 05, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-8WBrOOPkoz47zossdacPDoByNiTglyxAujl7u8V4lNM.jar
    Aug 05, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests-swNsAoT4tYx4LuKHzjXpdLVpxPcx5XKAS9RZZu3WPdU.jar
    Aug 05, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests-Zz9z22oz7FQJR_MI3krM1aJU1wH5Y0042JUEEy2ori8.jar
    Aug 05, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.24.0-SNAPSHOT-EEishxjhKmWMFDLWPM933y0SwZWf5gPzM_0j09c6ANw.jar
    Aug 05, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests-zNOq6zrUtrGYC7rn1Vy0VEKbG38IA5YE3wBQkf18Lmk.jar
    Aug 05, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-tests-idEU-nFNHJn3i9V0GL9OASlx5IgBYiiAomqf8Ew06SU.jar
    Aug 05, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-Ad5PpwqfyGxIvYFVUhJ8iD7NXZIjotUt6f9mbXkTlTw.jar
    Aug 05, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded-Rqj7VwVO_S6TokuecbgGjRj1RGEwtAYkQ0i7w64rR_4.jar
    Aug 05, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.24.0-SNAPSHOT-iIWV6_1YliOxBrKNSUaFv1_a-aTXxXCEH76zXZNWSqI.jar
    Aug 05, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.24.0-SNAPSHOT-qPI_dK_bLrMTBRvU8cseQp1lfbvBjcvnnzDg0J71_54.jar
    Aug 05, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.24.0-SNAPSHOT-bhfMHkw6OLIWvHMZ2uwWl4qCED_o7cE6sGKz_U9T0us.jar
    Aug 05, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 1 seconds
    Aug 05, 2020 12:45:42 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 05, 2020 12:45:42 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 05, 2020 12:45:42 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 05, 2020 12:45:42 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 05, 2020 12:45:42 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 05, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93804 bytes, hash 9114cd368deb6b41656ee01d979ca1c9e607b41a60d36acb342262deac9dfe51> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-kRTNNo3ra0FlbuAdl5yhyeYHtBpg02rLNCJi3qyd_lE.pb
    Aug 05, 2020 12:45:42 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.24.0-SNAPSHOT
    Aug 05, 2020 12:45:44 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-04_17_45_42-16659283530607027882?project=apache-beam-testing
    Aug 05, 2020 12:45:44 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-04_17_45_42-16659283530607027882
    Aug 05, 2020 12:45:44 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-04_17_45_42-16659283530607027882
    Aug 05, 2020 12:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-05T00:45:42.983Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 05, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-05T00:45:50.898Z: Worker configuration: n1-standard-1 in us-central1-a.
    Aug 05, 2020 12:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-05T00:45:51.638Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 05, 2020 12:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-05T00:45:51.838Z: Expanding GroupByKey operations into optimizable parts.
    Aug 05, 2020 12:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-05T00:45:51.864Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 05, 2020 12:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-05T00:45:51.939Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 05, 2020 12:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-05T00:45:51.985Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 05, 2020 12:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-05T00:45:52.023Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 05, 2020 12:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-05T00:45:52.060Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 05, 2020 12:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-05T00:45:52.445Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 05, 2020 12:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-05T00:45:52.516Z: Starting 5 workers in us-central1-a...
    Aug 05, 2020 12:46:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-05T00:45:59.500Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 05, 2020 12:46:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-05T00:46:23.032Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Aug 05, 2020 12:46:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-05T00:46:23.069Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Aug 05, 2020 12:46:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-05T00:46:28.418Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 05, 2020 12:46:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-05T00:46:43.890Z: Workers have started successfully.
    Aug 05, 2020 12:46:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-05T00:46:43.925Z: Workers have started successfully.
    Aug 05, 2020 12:47:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-05T00:47:20.947Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 05, 2020 12:47:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-05T00:47:21.156Z: Cleaning up.
    Aug 05, 2020 12:47:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-05T00:47:21.231Z: Stopping worker pool...
    Aug 05, 2020 12:48:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-05T00:48:15.725Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 05, 2020 12:48:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-05T00:48:15.771Z: Worker pool stopped.
    Aug 05, 2020 12:48:21 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-04_17_45_42-16659283530607027882 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): dacb8240-7a08-4419-bf41-1791700e7553 and timestamp: 2020-08-05T00:48:21.565000000Z:
                     Metric:                    Value:
                   read_time                    17.771
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 05, 2020 12:48:22 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 3 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.031 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.039 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 7,5,main]) completed. Took 2 mins 54.92 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 5s
106 actionable tasks: 76 executed, 30 from cache

Publishing build scan...
https://gradle.com/s/igcgfjv3f4zjo

Stopped 2 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #831

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/831/display/redirect?page=changes>

Changes:

[dcavazos] [BEAM-7390] Add sample code snippets

[ihr] Fix add field method in SQL walkthrough


------------------------------------------
[...truncated 292.97 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 04, 2020 6:45:23 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 04, 2020 6:45:23 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 04, 2020 6:45:23 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 04, 2020 6:45:23 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 04, 2020 6:45:23 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 04, 2020 6:45:23 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 04, 2020 6:45:24 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 04, 2020 6:45:24 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 04, 2020 6:45:24 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 04, 2020 6:45:24 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 04, 2020 6:45:24 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 04, 2020 6:45:24 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 04, 2020 6:45:24 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 04, 2020 6:45:24 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 04, 2020 6:45:24 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 04, 2020 6:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 04, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 04, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-2LscdFb1h-iMzE3zL9PTzojAchFoT5dte3r-QPg5W9Y.jar
    Aug 04, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-le3pzzqUPzIHVdJfLQ1m6WxCQg-Pe044AgJaZhrKI_4.jar
    Aug 04, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-5G0rST1scFRcAYn8QSStNKR1u7gAPjKNbDKI7Cwrbko.jar
    Aug 04, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT-a79QUC3r4RBOsKbVNPR0ggUeap9rmn15PoF9Y_zZbn4.jar
    Aug 04, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests-XoO-7qhrqMyc-3di2nvZgGAqChCMt4cq2-jmyjcReJQ.jar
    Aug 04, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-rSFKE-epbIwPSHQewYNPd4K--GZM6SAzCXNgVEPVvb8.jar
    Aug 04, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-2LscdFb1h-iMzE3zL9PTzojAchFoT5dte3r-QPg5W9Y.jar
    Aug 04, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests-yjROpN-zalD0bi4_ChvNcNQQEQXpVjtkEvsfahRzlIU.jar
    Aug 04, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests-m7rKxw6TFF9cs22dgeNuqKUukEq_Ip6NJWOVsepN7bQ.jar
    Aug 04, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-MEUQ3BRjKWKZwhY-LpEO4EXjcC31f074HEDixiCCPfI.jar
    Aug 04, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests-4QATsNRXbSdjc0vwcvIWdW3B7EULdODrmV2cHHu95Qs.jar
    Aug 04, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests-hxzZhPKFD0ccNvHpeAy4rJKP9irB5fLghYGSxZUHD5M.jar
    Aug 04, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests-NBtuoymp2WnDYy0DQ66pNu_GXZKQr5RfqHuW9Ityxvs.jar
    Aug 04, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.24.0-SNAPSHOT-L0SClLT77iqrziJxWoDOb6yGioivlic2500wECVPtbk.jar
    Aug 04, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-k7iTMzpQxi1_WCn6h93TCzKybc_ipSMwxkQtY-SoJA0.jar
    Aug 04, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8318854515346624574.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-xZw3376s-MRwYfQmqFn9UBESptikWlEkKe2an86nr9Q.jar
    Aug 04, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-ZehZ8ifa4BY1sCiiPyk7d3lZm_dBKsoukOqdVnLl9Pc.jar
    Aug 04, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-DN8_N2bZxAmF5l5GPr6Y0M1iiDgGze-gnEcXrjLaMtw.jar
    Aug 04, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests-zfvewX_hYksaAcZRPRdzs-c1AiJEVDCG0FFytOV2yAI.jar
    Aug 04, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT-4TAYoDh3yaIYO2r9hJf5vUIOrla8_SS0K-RC1DBo4fo.jar
    Aug 04, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-tests-hzjvageizKd202aErMcgtzmvZnXLoeqFIcMI5039p9k.jar
    Aug 04, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-H3mt-PJFPPqCw266VmSb_8hYhPLEMdf5ZBQKjOWdRhk.jar
    Aug 04, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.24.0-SNAPSHOT-e0ydxKIXqTLbOuf5KPI2WgnWaYyfiryn7YTsyaYYHI8.jar
    Aug 04, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded-zA6tQVC6M-SrnBn5UB8X5iMxps1LOMLkjlkeHDxeptQ.jar
    Aug 04, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-mn6EiIKNG6gtNmY_qDr-8ZcKbwgT7D2U_sYFz_n_mt0.jar
    Aug 04, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-tVdgytmUOJ-tyWSo1y3BMFCeWAv86dYEsjTo6JUn-98.jar
    Aug 04, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.24.0-SNAPSHOT-qQl_aXYKxrmZzfPHMjQ6dZhcIDOqPnm1nfUPicCTyhY.jar
    Aug 04, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT-71pSJUlckM1ME3-m0tvoVcJVAE6y0EUDwLpCHeeKduQ.jar
    Aug 04, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.24.0-SNAPSHOT-OihKLHLpD9zPkTDDQ3QZ5Ej_02oTzlwoMsKapBVWCkw.jar
    Aug 04, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.24.0-SNAPSHOT-KX9nVVVFqk0NEeJ7nq15oAfe5CZJ0f3lEEO8SRw9F-8.jar
    Aug 04, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.24.0-SNAPSHOT-2YaOZv5PkCWZsvBhPkJ4VF83qC2sT6XbW11_WM2_M94.jar
    Aug 04, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 1 seconds
    Aug 04, 2020 6:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 04, 2020 6:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 04, 2020 6:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 04, 2020 6:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 04, 2020 6:45:29 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 04, 2020 6:45:29 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93804 bytes, hash f809764feb316cab09aabd2d677c909c4e62e50c141968e0fca5420d8b1993d0> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline--Al2T-sxbKsJqr0tZ3yQnE5i5QwUGWjg_KVCDYsZk9A.pb
    Aug 04, 2020 6:45:29 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.24.0-SNAPSHOT
    Aug 04, 2020 6:45:30 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-04_11_45_29-16953589181054636957?project=apache-beam-testing
    Aug 04, 2020 6:45:30 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-04_11_45_29-16953589181054636957
    Aug 04, 2020 6:45:30 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-04_11_45_29-16953589181054636957
    Aug 04, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-04T18:45:29.505Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 04, 2020 6:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-04T18:45:37.689Z: Worker configuration: n1-standard-1 in us-central1-b.
    Aug 04, 2020 6:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-04T18:45:38.485Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 04, 2020 6:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-04T18:45:38.514Z: Expanding GroupByKey operations into optimizable parts.
    Aug 04, 2020 6:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-04T18:45:38.543Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 04, 2020 6:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-04T18:45:38.611Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 04, 2020 6:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-04T18:45:38.638Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 04, 2020 6:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-04T18:45:38.663Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 04, 2020 6:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-04T18:45:38.700Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 04, 2020 6:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-04T18:45:39.092Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 04, 2020 6:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-04T18:45:39.184Z: Starting 5 workers in us-central1-b...
    Aug 04, 2020 6:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-04T18:46:06.717Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 04, 2020 6:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-04T18:46:08.490Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 04, 2020 6:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-04T18:46:28.635Z: Workers have started successfully.
    Aug 04, 2020 6:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-04T18:46:28.664Z: Workers have started successfully.
    Aug 04, 2020 6:47:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-04T18:47:01.967Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 04, 2020 6:47:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-04T18:47:02.179Z: Cleaning up.
    Aug 04, 2020 6:47:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-04T18:47:02.269Z: Stopping worker pool...
    Aug 04, 2020 6:47:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-04T18:47:50.877Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 04, 2020 6:47:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-04T18:47:50.924Z: Worker pool stopped.
    Aug 04, 2020 6:48:00 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-04_11_45_29-16953589181054636957 finished with status DONE.

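(For readers following the push-down behavior logged above -- BeamPushDownIOSourceRel with usedFields=[by, type, title, score] and the pushed filter "(`type` = 'story' OR `type` = 'job') AND `score` > 2" -- the effect at the IO level is roughly equivalent to a BigQuery Storage API read configured with selected fields and a row restriction. The sketch below is illustrative only, not the integration test's code; the table reference and pipeline setup are assumptions.)

import java.util.Arrays;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.values.PCollection;
import com.google.api.services.bigquery.model.TableRow;

public class DirectReadPushDownSketch {
  public static void main(String[] args) {
    Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

    // Project push-down: only the four used fields are requested from the
    // BigQuery Storage API. Filter push-down: the WHERE clause is sent as a
    // row restriction. The table name below is hypothetical.
    PCollection<TableRow> rows =
        p.apply(
            BigQueryIO.readTableRows()
                .from("bigquery-public-data:hacker_news.full") // hypothetical table
                .withMethod(BigQueryIO.TypedRead.Method.DIRECT_READ)
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

    p.run().waitUntilFinish();
  }
}
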
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): d57b66c5-18d6-4e8d-b7d1-b11bae0460fa and timestamp: 2020-08-04T18:48:00.702000000Z:
                     Metric:                    Value:
                   read_time                    14.209
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 04, 2020 6:48:01 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.033 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 2,5,main]) completed. Took 2 mins 47.886 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 43s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/zgmayhov4s5oa

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #830

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/830/display/redirect?page=changes>

Changes:

[kcweaver] [BEAM-7632] Add Python quickstart instructions for Flink and Spark.

[Maximilian Michels] [BEAM-10622] Prefix Gradle paths with a colon for user-facing output


------------------------------------------
[...truncated 297.32 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 04, 2020 12:45:47 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 04, 2020 12:45:47 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 04, 2020 12:45:48 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 04, 2020 12:45:48 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 04, 2020 12:45:48 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 04, 2020 12:45:48 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 04, 2020 12:45:48 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

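(The IllegalStateException above is the usual "missing schema on a PCollection<Row>" failure: a ParDo that emits Row cannot have a coder inferred, so its output needs either setCoder(RowCoder.of(schema)) or setRowSchema(schema), exactly as the message suggests. Below is a minimal, self-contained sketch with a hypothetical schema and pass-through DoFn -- not the test's RowMonitor.)

import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.coders.RowCoder;
import org.apache.beam.sdk.schemas.Schema;
import org.apache.beam.sdk.transforms.Create;
import org.apache.beam.sdk.transforms.DoFn;
import org.apache.beam.sdk.transforms.ParDo;
import org.apache.beam.sdk.values.PCollection;
import org.apache.beam.sdk.values.Row;

public class RowSchemaSketch {

  // Trivial pass-through DoFn standing in for a monitoring ParDo over rows;
  // the real RowMonitor implementation is not shown in this log.
  static class PassThroughFn extends DoFn<Row, Row> {
    @ProcessElement
    public void processElement(@Element Row row, OutputReceiver<Row> out) {
      out.output(row);
    }
  }

  public static void main(String[] args) {
    Pipeline p = Pipeline.create();

    // Hypothetical schema mirroring the four projected columns in the query above.
    Schema schema =
        Schema.builder()
            .addStringField("author")
            .addStringField("type")
            .addStringField("title")
            .addInt64Field("score")
            .build();

    Row example =
        Row.withSchema(schema).addValues("someone", "story", "a title", 3L).build();

    // Attaching the schema to the ParDo output (or calling
    // setCoder(RowCoder.of(schema))) avoids the "Unable to return a default
    // Coder ... PCollection.setRowSchema" error shown above.
    PCollection<Row> rows =
        p.apply(Create.of(example).withCoder(RowCoder.of(schema)))
            .apply("PassThrough", ParDo.of(new PassThroughFn()))
            .setRowSchema(schema);

    p.run().waitUntilFinish();
  }
}
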
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 04, 2020 12:45:48 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 04, 2020 12:45:48 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 04, 2020 12:45:48 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 04, 2020 12:45:48 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 04, 2020 12:45:48 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 04, 2020 12:45:48 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 04, 2020 12:45:48 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 04, 2020 12:45:48 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 04, 2020 12:45:49 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 04, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 04, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-Z2sSGFsRFzSKs7NmZb4zbbQPpbJfgWqYxgc2OfOa77s.jar
    Aug 04, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7955736401748410305.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-otL4gRA9D3OIso7tRVyHTQ3PNGM5peQpC1seiMxM41g.jar
    Aug 04, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-3tdzxTmAL0zfeG3pkHU0Y3iQ6xJmixl1KYcotgC8tLw.jar
    Aug 04, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-Z2sSGFsRFzSKs7NmZb4zbbQPpbJfgWqYxgc2OfOa77s.jar
    Aug 04, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.24.0-SNAPSHOT-bD2Jd2UhEku9NkZahUcC87jIqehPvotBo4StAssWcFI.jar
    Aug 04, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-EdePTo93tF2el9-z1NvT5pVUDj6Vcyxaz3zoH6USw60.jar
    Aug 04, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests-qeXNpJlxrKk27EGda7NBC2qQZy9KrbpNPQxhHaIMbiA.jar
    Aug 04, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests-oA7oiSY_ggqwLhdIpPC-RM-QzdM6hIs1UrEb7DFevmA.jar
    Aug 04, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-1cdyFpyigE_RskAMySYy3QduTRIV3JvBM9o9ifKBu3A.jar
    Aug 04, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-sts356qwlAV5cJ8gUvgDjxySdOGqpumv_UsqdWc2IO0.jar
    Aug 04, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests-hxdzpeWzqSqS1dhjzc-h6gL0o95WXWmCSA4yg84Vuus.jar
    Aug 04, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.24.0-SNAPSHOT-K92QEql-kMSqY7i_gaIreu7FxyXDurzBVQz3xqiwE0Q.jar
    Aug 04, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT-c8UznBsHKEkVNUFTvIInoelS6QI-AwYaraoKQ79giMA.jar
    Aug 04, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests-3Z-nzoqcl6XyLRuCjQI85dAjjWgPFKxUE1g1cBImmUw.jar
    Aug 04, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-cvhEUa5mMFVsmkGAZo0iQTzuQStg513bKdn5vPVYZmE.jar
    Aug 04, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests-0RRB27X2dRy7zwKkGdNSnIKWkjlF9KbrLM0OF8sE0y8.jar
    Aug 04, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT-e0M6ul453s8HqpqMr6H1aQZrgSm1nIt1e0FBonc3Mrw.jar
    Aug 04, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-N2-P0i-a710b4xsTZeHWcZwsbnt_MAznUepWBe7e9uY.jar
    Aug 04, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests-GIGm6HmxWU2cRtlyJqBz3CefPBUx7r3UrqGBaBten0A.jar
    Aug 04, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT--eqM_q93bjyhpo9cjUUAo7fSQK3abVJUGLi9bBMpRt4.jar
    Aug 04, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-JW7Re7L-qz8oHgoT_dE7fZhPUpNhMRcD4ihfzKgGd2U.jar
    Aug 04, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-_ACG1YWo8Wn9HuU2jKLtAEd5jRo1mrRvbLbPMG0yLBo.jar
    Aug 04, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-Jwtm6uxsD3nvX-SmX4pPSuanhPslY95VoO3tVaO3NVE.jar
    Aug 04, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests-0tHG5YdRvBUtPcd02Vi-gH_QGv5z16T3DpMMxpbkT1E.jar
    Aug 04, 2020 12:45:52 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.24.0-SNAPSHOT-d_UVy46zXN_WYnYoThu1eVhuWa_hhR1AlpkdZhCf-e8.jar
    Aug 04, 2020 12:45:52 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded-qELyMANqzc_KLvtq8ona3W1ZxOWUSy9aLfDJzuch_wY.jar
    Aug 04, 2020 12:45:52 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-TZ36R8C5B83fBIEUgyf8lDckT8qDcfNjNQnYLgEiUtA.jar
    Aug 04, 2020 12:45:52 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-tests-JTNDj1Dmy0EB3e8Ex5zCGh3fqwSbJHC3K-LbL0_kDIQ.jar
    Aug 04, 2020 12:45:52 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.24.0-SNAPSHOT-_TKQEgTRRWbmpXZMUYDFaWbJ1BM3C2McBdBLrIHLDx0.jar
    Aug 04, 2020 12:45:52 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.24.0-SNAPSHOT-1uUxIIeDzCSOJeBp-DarfZkvelxejzy42cw6qvPciqM.jar
    Aug 04, 2020 12:45:52 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.24.0-SNAPSHOT-fOh63t6DwUI5zIxwpLRTFv6bOkKXsbAslFxj5Gzzjbw.jar
    Aug 04, 2020 12:45:52 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 1 seconds
    Aug 04, 2020 12:45:52 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 04, 2020 12:45:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 04, 2020 12:45:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 04, 2020 12:45:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 04, 2020 12:45:53 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 04, 2020 12:45:53 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93804 bytes, hash 2189eefd1a56aa3a9ef29be5e89f38da21f6275ce33d41af4e6cb442cc69256c> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-IYnu_RpWqjqe8pvl6J842iH2J1zjPUGvTmy0QsxpJWw.pb
    Aug 04, 2020 12:45:53 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.24.0-SNAPSHOT
    Aug 04, 2020 12:45:54 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-04_05_45_53-16637608913194490727?project=apache-beam-testing
    Aug 04, 2020 12:45:54 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-04_05_45_53-16637608913194490727
    Aug 04, 2020 12:45:54 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-04_05_45_53-16637608913194490727
    Aug 04, 2020 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-04T12:45:53.573Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 04, 2020 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-04T12:46:00.848Z: Worker configuration: n1-standard-1 in us-central1-a.
    Aug 04, 2020 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-04T12:46:01.640Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 04, 2020 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-04T12:46:01.680Z: Expanding GroupByKey operations into optimizable parts.
    Aug 04, 2020 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-04T12:46:01.709Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 04, 2020 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-04T12:46:01.781Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 04, 2020 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-04T12:46:01.811Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 04, 2020 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-04T12:46:01.838Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 04, 2020 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-04T12:46:01.867Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 04, 2020 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-04T12:46:02.366Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 04, 2020 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-04T12:46:02.447Z: Starting 5 workers in us-central1-a...
    Aug 04, 2020 12:46:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-04T12:46:25.727Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 04, 2020 12:46:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-04T12:46:27.426Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 04, 2020 12:46:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-04T12:46:44.851Z: Workers have started successfully.
    Aug 04, 2020 12:46:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-04T12:46:44.884Z: Workers have started successfully.
    Aug 04, 2020 12:47:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-04T12:47:22.768Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 04, 2020 12:47:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-04T12:47:22.965Z: Cleaning up.
    Aug 04, 2020 12:47:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-04T12:47:23.054Z: Stopping worker pool...
    Aug 04, 2020 12:48:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-04T12:48:08.022Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 04, 2020 12:48:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-04T12:48:08.077Z: Worker pool stopped.
    Aug 04, 2020 12:48:19 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-04_05_45_53-16637608913194490727 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): e7a721cb-0685-4be8-a282-2c4f4319ac7a and timestamp: 2020-08-04T12:48:20.013000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     20.11

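(The read_time and fields_read values above are Beam metrics gathered by the test's monitoring transforms. As a rough sketch of how such metrics are typically recorded with the Beam Metrics API -- hypothetical namespace and logic, not the actual TimeMonitor/RowMonitor implementation:)

import org.apache.beam.sdk.metrics.Counter;
import org.apache.beam.sdk.metrics.Distribution;
import org.apache.beam.sdk.metrics.Metrics;
import org.apache.beam.sdk.transforms.DoFn;
import org.apache.beam.sdk.values.Row;

// Hypothetical monitoring DoFn: counts fields and records element processing
// timestamps so values like "fields_read" and a read-time window can be derived
// from the job's metrics after the run.
public class MonitorFn extends DoFn<Row, Row> {
  private final Counter fieldsRead = Metrics.counter("perf-test", "fields_read");
  private final Distribution readTimeMs = Metrics.distribution("perf-test", "read_time_ms");

  @ProcessElement
  public void processElement(@Element Row row, OutputReceiver<Row> out) {
    fieldsRead.inc(row.getFieldCount());           // one increment per field in the row
    readTimeMs.update(System.currentTimeMillis()); // min/max bound the read window
    out.output(row);
  }
}
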
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 04, 2020 12:48:20 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 3 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.019 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.034 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 5,5,main]) completed. Took 2 mins 41.286 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 3s
106 actionable tasks: 75 executed, 31 from cache

Publishing build scan...
https://gradle.com/s/6gx4czg3fftca

Stopped 2 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #829

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/829/display/redirect>

Changes:


------------------------------------------
[...truncated 293.91 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 04, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 04, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 04, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 04, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 04, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 04, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 04, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 04, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 04, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 04, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 04, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 04, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 04, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 04, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 04, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 04, 2020 6:45:12 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 04, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 04, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-JGH0iqG5-yW8rrxT8z0FidesDSq4a2NvAghYp7Qrj64.jar
    Aug 04, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-mopPGUcPcspo4obzGDp3J0STUyqhnW_Otmd9fRhg4yQ.jar
    Aug 04, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests-qmc7c6xo0gXy3oRiKvi7BkxkjkgF3ZyJM-tBPdp_7qY.jar
    Aug 04, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests-ekhPHFsdyEbsanUwZXQ8L-m6IGCjJBnltdUxvWSUP34.jar
    Aug 04, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded-Bs9EA1pvO135XMu5ZQxH4NNZIPX61jug4Q9B1bgpWTk.jar
    Aug 04, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT-VLJW3ML4D10L4dCVpmqxIKozXZdlctB6xC8HbRKhP80.jar
    Aug 04, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-y2fW0zc5442N1ylNOxYGHzEP6BUqcnqnu8WpResUI4w.jar
    Aug 04, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests-nXM51SxjkE5YUgmWn3-38H38uCPyYmzChSZ-gAB1Rx8.jar
    Aug 04, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-stq7cHRlM1xU6MApeIoa83bEBKHee6wJoH20ySPQoe8.jar
    Aug 04, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests-zCJUSeH33XbcLgj2PpUlopwjOSWCabITU8Ax3p2LZ7I.jar
    Aug 04, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-OEJLnBy78xPoZgiWHemnWS3axv_g9EEevq8hJkccHIk.jar
    Aug 04, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT-qGQ_mLgWU5c27eMe4VzakjLqher2DB168bafDVy2feI.jar
    Aug 04, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT-y9zw8QlNc3I2-ZnlLpWe1L7B9u8PvxWvMH5PpwQ6Bt8.jar
    Aug 04, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test80945058648309513.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-Z6bUl3HeuALdyWRuTDguJhoQaks6E_tUCG8ToUzNl68.jar
    Aug 04, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-JGH0iqG5-yW8rrxT8z0FidesDSq4a2NvAghYp7Qrj64.jar
    Aug 04, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-tests-LkG5S9C0FnPcIxrhU1esTDyO3SvWg2DLWA7157ND6Rc.jar
    Aug 04, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.24.0-SNAPSHOT-4vDSdG-S-tR0EtWJOqjTsXW0x-IAoEEt-_bMCvRObTs.jar
    Aug 04, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-Wb6IA8h1iFZUsRXCxU0RlPbnQGrDUA7IWNOit1QmWqY.jar
    Aug 04, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-3Kqq3443DqBbfdauP4fH56fw1K2ayGKn6agS8c41GuE.jar
    Aug 04, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-rtXIxQfx2JDjRj515khi53a5Fm-f4GjwewVieVyqkQk.jar
    Aug 04, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests-dayIAEnCypqB9237dEBv6ne_gppLeOtdyFEWPPygFR4.jar
    Aug 04, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-2mkI2RKHV5gLz4sQgfoMWYYQgQH_d_cQvWu088kB4as.jar
    Aug 04, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests-5sHiu2ywMdl8OL5_xHxJZSQGNhhu0RAXMk1zcVLmOyQ.jar
    Aug 04, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.24.0-SNAPSHOT-oLOpKm-jNsRpCIRUJB_4OKFIGJEkRAAJ0pFbiJEKcKs.jar
    Aug 04, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.24.0-SNAPSHOT--UjA76HBYOU261zpvyZCYlOJEuhQgT2I_s1O54ex6Wc.jar
    Aug 04, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests-R9bvlgCnEZxLuPZeBYs4KFHXLxeqO91Npn2TUjNoYTs.jar
    Aug 04, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.24.0-SNAPSHOT-eblvcqvqo2AI-1-rnZL4pS0VTHSCyaSMz_H4GdBh1Po.jar
    Aug 04, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.24.0-SNAPSHOT-UDyXkB0-bXjw-q224sxAzxKR84YgQKz_j5ryu_iEWy4.jar
    Aug 04, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.24.0-SNAPSHOT-UmKmLeX7RTTpJmbRWRIgV7v-6p1j6q2sDbXmzYb-_eQ.jar
    Aug 04, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-L5eI_olis3tgcKVGWIDOfFoQtbIQEHUw7pNVxFR2srY.jar
    Aug 04, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-N1HYnqjexw9RiUVUS4JmeiAbFmWzzHPki2gy2ckeAxw.jar
    Aug 04, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 1 seconds
    Aug 04, 2020 6:45:15 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 04, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 04, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 04, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 04, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 04, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93802 bytes, hash 1a5358caa0366d16f28dab4ffd4c5f1a0e121bbc0c5cfeceb2fcc2a80aa2c1b5> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-GlNYyqA2bRbyjatP_UxfGg4SG7wMXP7OsvzCqAqiwbU.pb
    Aug 04, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.24.0-SNAPSHOT
    Aug 04, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-03_23_45_16-10949142640136742042?project=apache-beam-testing
    Aug 04, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-03_23_45_16-10949142640136742042
    Aug 04, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-03_23_45_16-10949142640136742042
    Aug 04, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-04T06:45:16.489Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 04, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-04T06:45:25.044Z: Worker configuration: n1-standard-1 in us-central1-f.
    Aug 04, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-04T06:45:25.802Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 04, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-04T06:45:25.848Z: Expanding GroupByKey operations into optimizable parts.
    Aug 04, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-04T06:45:25.875Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 04, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-04T06:45:25.951Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 04, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-04T06:45:25.987Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 04, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-04T06:45:26.021Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 04, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-04T06:45:26.057Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 04, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-04T06:45:26.454Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 04, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-04T06:45:26.530Z: Starting 5 workers in us-central1-f...
    Aug 04, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-04T06:45:48.030Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 04, 2020 6:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-04T06:45:51.708Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 04, 2020 6:46:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-04T06:46:12.053Z: Workers have started successfully.
    Aug 04, 2020 6:46:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-04T06:46:12.093Z: Workers have started successfully.
    Aug 04, 2020 6:46:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-04T06:46:52.910Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 04, 2020 6:46:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-04T06:46:53.183Z: Cleaning up.
    Aug 04, 2020 6:46:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-04T06:46:53.318Z: Stopping worker pool...
    Aug 04, 2020 6:47:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-04T06:47:45.531Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 04, 2020 6:47:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-04T06:47:45.573Z: Worker pool stopped.
    Aug 04, 2020 6:47:52 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-03_23_45_16-10949142640136742042 finished with status DONE.
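
The BEAMPlan and "Pushing down the following filter" entries above show what the DIRECT_READ run executed: BeamPushDownIOSourceRel reads only usedFields=[by, type, title, score] and hands the predicate to BigQuery, so both the column projection and the filter are applied on the storage side before rows reach the workers. Outside Beam SQL, roughly the same read can be written directly against BigQueryIO; the sketch below is an assumed illustration of that equivalence, not code from this test, and the table reference is a placeholder (the IT's actual table is configured elsewhere and not shown in this log).

    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
        p.apply(
            "Read HACKER_NEWS with projection and filter push-down",
            BigQueryIO.readTableRows()
                .from("<project>:<dataset>.HACKER_NEWS") // placeholder table reference
                .withMethod(Method.DIRECT_READ)
                // Column projection, matching usedFields in the logged plan.
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                // Server-side row restriction, matching the pushed-down filter above.
                .withRowRestriction("(`type` = 'story' OR `type` = 'job') AND `score` > 2"));
        p.run().waitUntilFinish();
      }
    }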

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): aae94b85-931a-454e-9274-bdb10281d88f and timestamp: 2020-08-04T06:47:52.058000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    20.598

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 04, 2020 6:47:52 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
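
For context on the numbers above: fields_read and read_time are Beam metrics, apparently accumulated by the RowMonitor/TimeMonitor ParDos in the pipeline and read back from the PipelineResult once the job reaches DONE; the warning only means they were not forwarded to InfluxDB because no measurement/database was configured for this run, so the values exist solely in this console output. Below is a minimal sketch of that record-and-query pattern (the namespace and counter wiring are assumptions for illustration, not the IT's actual identifiers).

    import org.apache.beam.sdk.PipelineResult;
    import org.apache.beam.sdk.metrics.Counter;
    import org.apache.beam.sdk.metrics.MetricNameFilter;
    import org.apache.beam.sdk.metrics.MetricQueryResults;
    import org.apache.beam.sdk.metrics.MetricResult;
    import org.apache.beam.sdk.metrics.Metrics;
    import org.apache.beam.sdk.metrics.MetricsFilter;
    import org.apache.beam.sdk.transforms.DoFn;

    class MetricsSketch {
      // In a monitoring DoFn: count how many fields pass through.
      static class CountFieldsFn extends DoFn<Long, Long> {
        private final Counter fieldsRead = Metrics.counter("perf_test", "fields_read");

        @ProcessElement
        public void processElement(@Element Long fieldsInRow, OutputReceiver<Long> out) {
          fieldsRead.inc(fieldsInRow);
          out.output(fieldsInRow);
        }
      }

      // After the job finishes: pull the attempted counter value out of the result.
      static long queryFieldsRead(PipelineResult result) {
        MetricQueryResults metrics =
            result.metrics().queryMetrics(
                MetricsFilter.builder()
                    .addNameFilter(MetricNameFilter.named("perf_test", "fields_read"))
                    .build());
        long total = 0;
        for (MetricResult<Long> counter : metrics.getCounters()) {
          total += counter.getAttempted();
        }
        return total;
      }
    }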

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.019 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker,5,main]) completed. Took 2 mins 49.242 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
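
To iterate on just the failing cases without re-running the whole Jenkins job, the task and test class named above can in principle be targeted with Gradle's standard test filter from the repository root; the pipeline options this IT expects (project, temp locations, metrics settings, etc.) are supplied by the Jenkins job configuration and are not shown here, so the command below is a sketch rather than a verified reproduction:

    > ./gradlew :sdks:java:extensions:sql:perf-tests:integrationTest --tests "org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT" --stacktrace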

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 36s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/llxef7d5yzw5i

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #828

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/828/display/redirect?page=changes>

Changes:

[Boyuan Zhang] Refactor SplittableParDoExpander.

[ningk] Fixed pcoll visualization in datatable

[borzoo.esmailloo] [BEAM-10499] Adds a descriptive toString to SamzaRunner KeyedTimerData

[noreply] [BEAM-9839] OnTimerContext should not create a new one when processing

[noreply] Fix dictionary changes size error in pickler.py (#12458)

[noreply] [BEAM-9891] TPC-DS module initialization, tables and queries stored

[Boyuan Zhang] Address comments.

[Boyuan Zhang] Fix SDF/Process input id.

[noreply] Interactive: clean up when pipeline is out of scope (#12339)

[noreply] Merge pull request #12331 [BEAM-10601] DICOM API Beam IO connector

[noreply] [BEAM-10631] Fix performance of Schema#indexOf (#12456)


------------------------------------------
[...truncated 293.97 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 04, 2020 12:45:38 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 04, 2020 12:45:38 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 04, 2020 12:45:38 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 04, 2020 12:45:38 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 04, 2020 12:45:38 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 04, 2020 12:45:38 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 04, 2020 12:45:38 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 04, 2020 12:45:38 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 04, 2020 12:45:38 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 04, 2020 12:45:38 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 04, 2020 12:45:38 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 04, 2020 12:45:38 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 04, 2020 12:45:38 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 04, 2020 12:45:39 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 04, 2020 12:45:39 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 04, 2020 12:45:40 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 04, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 04, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-F0UaWbrlvaDIFhrwxs8MBKc0veqquNyTTmxD8_b-J_M.jar
    Aug 04, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests-FkAlgDi2R2naO8kxhyHpvBebi0VDbuaLvtsloFBuxvI.jar
    Aug 04, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT-TTIl1Ls4IiGb8ZJvfj005gX5yz_f1mpZIM1MllTj4Ek.jar
    Aug 04, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-0qx2k-OH8nHQBY4ptNkumOWZTRdUm-pYypaSuWqYvJY.jar
    Aug 04, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests-VxalatcXN1Wm2_cdC91YboF3n-f8FFaAoQqXy6SMHvE.jar
    Aug 04, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5752555465711511101.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-rxd-jtJ7UyWc-QjnuaB4Qdl-ZoLHcdAsRkLQ4p1yuns.jar
    Aug 04, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-fx1pwT0E70vHfKOrlAFcq2j5LgB8MhxsAPWy4_BMxXM.jar
    Aug 04, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-eTytSkDYpmf_Gkwrorjh0AGrtEt5cfHMj4iMkyCVXh8.jar
    Aug 04, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT-KqaurezKFlDhqL4pMWuOgORv5Vks9b1AhHaOOxNyhA8.jar
    Aug 04, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-BtsSwfeq4_riLuD5Lx0AYV2FAjf5EgwUbyV0-n_Hv5o.jar
    Aug 04, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests-tCp1DAlZIFOcjHsFY52ZKu78irzf5e2FR4JjojNWEJM.jar
    Aug 04, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-XTTSoKcfY3dD5TdLiBdEiCl_6cXOOr8q6HHPP4kH56s.jar
    Aug 04, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests-oEg3jxCRn-PAzoUX9SeGSqut1NeZAWQEAbSv-0L2-u4.jar
    Aug 04, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests-19KRSB5WriNFTaUeFROAOn8oVL_TFQfv473PsLAqwIE.jar
    Aug 04, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests-E8qPBqw51_QpM4PeJGZiC-vFnRjCWe7o6KqfEd-4sWA.jar
    Aug 04, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-tests-lBFIkwNSqBrkBKglrhO07D0e1hgv7DFss19jTIE9Lno.jar
    Aug 04, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-hS3Fl4LSKpJlLKVmahpsvAAhKZV0Aube2LDw65E3R3k.jar
    Aug 04, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-65pholSTNM5J5asxkTnq9OKbejYhRqA-ZXeE2wzwAR8.jar
    Aug 04, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests-srHrVDiIYqNXrk-ECJTvcugUJPPcRBj1Km7fuWFOzuY.jar
    Aug 04, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded-TY3bC5B1bCX7E2BsQnF0VYSL_IOqYIWjsT82jP2nwH8.jar
    Aug 04, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-hzNUSGt638yTw22r_ZI0Lj3Sofme-fnsl9dVFXcsmPs.jar
    Aug 04, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-Dz0jrAbTUyOioZYvcWODROZkoinxc33_UG5Eso-VpzA.jar
    Aug 04, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.24.0-SNAPSHOT-VA3dcxrPVe_TilxRfsQaQTowuDhACWSoQKbOEyM8GOc.jar
    Aug 04, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.24.0-SNAPSHOT-lVnLFblCESrFIF4ZFK0TwzXoFPH7QEoCmqYx1tB2mjc.jar
    Aug 04, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.24.0-SNAPSHOT-a8bor3_WRNX4yENHwen5cHLuYMGT-ACjQO3Wp3-yxOs.jar
    Aug 04, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT-7GQ_yZV_3sKucbeL6qi0MOXVsoiKQc_XbES15nVEY9A.jar
    Aug 04, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-F0UaWbrlvaDIFhrwxs8MBKc0veqquNyTTmxD8_b-J_M.jar
    Aug 04, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-WAqesfrNmjbeALQlGnSMXCCjkIxFIPDeIbeufAiP5h8.jar
    Aug 04, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.24.0-SNAPSHOT-s8Mg_ew9jGimHs7RgCTrW91GYWHcclPAxOpnyhKImUA.jar
    Aug 04, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.24.0-SNAPSHOT-v0iu1T8BZmGXfcDNrgABRXr6LhPmKKhqcou2wwX91BY.jar
    Aug 04, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.24.0-SNAPSHOT-8YaRsAIJPFfmq4pdzu7xj6JAyH7b2T9LmMHI0SFNoOk.jar
    Aug 04, 2020 12:45:44 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 1 seconds
    Aug 04, 2020 12:45:44 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 04, 2020 12:45:44 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 04, 2020 12:45:44 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 04, 2020 12:45:44 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 04, 2020 12:45:44 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 04, 2020 12:45:44 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93804 bytes, hash 301223f8a381159434a55ed3e4146727c8b525d9061678506d44230bc6d8b82f> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-MBIj-KOBFZQ0pV7T5BRnJ8i1JdkGFnhQbUQjC8bYuC8.pb
    Aug 04, 2020 12:45:44 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.24.0-SNAPSHOT
    Aug 04, 2020 12:45:46 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-03_17_45_44-7720245088131745463?project=apache-beam-testing
    Aug 04, 2020 12:45:46 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-03_17_45_44-7720245088131745463
    Aug 04, 2020 12:45:46 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-03_17_45_44-7720245088131745463
    Aug 04, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-04T00:45:44.892Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 04, 2020 12:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-04T00:45:52.205Z: Worker configuration: n1-standard-1 in us-central1-a.
    Aug 04, 2020 12:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-04T00:45:52.982Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 04, 2020 12:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-04T00:45:53.044Z: Expanding GroupByKey operations into optimizable parts.
    Aug 04, 2020 12:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-04T00:45:53.074Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 04, 2020 12:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-04T00:45:53.197Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 04, 2020 12:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-04T00:45:53.225Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 04, 2020 12:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-04T00:45:53.257Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 04, 2020 12:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-04T00:45:53.292Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 04, 2020 12:45:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-04T00:45:53.693Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 04, 2020 12:45:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-04T00:45:53.772Z: Starting 5 workers in us-central1-a...
    Aug 04, 2020 12:46:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-04T00:46:01.553Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 04, 2020 12:46:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-04T00:46:21.675Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 04, 2020 12:46:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-04T00:46:46.856Z: Workers have started successfully.
    Aug 04, 2020 12:46:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-04T00:46:46.893Z: Workers have started successfully.
    Aug 04, 2020 12:47:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-04T00:47:19.955Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 04, 2020 12:47:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-04T00:47:20.191Z: Cleaning up.
    Aug 04, 2020 12:47:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-04T00:47:20.287Z: Stopping worker pool...
    Aug 04, 2020 12:48:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-04T00:48:13.205Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 04, 2020 12:48:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-04T00:48:13.259Z: Worker pool stopped.
    Aug 04, 2020 12:48:20 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-03_17_45_44-7720245088131745463 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 15b8aeaa-9038-4dc5-8589-b97f7f13d25c and timestamp: 2020-08-04T00:48:20.968000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    16.068

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 04, 2020 12:48:21 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.016 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 2 mins 51.059 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 1s
106 actionable tasks: 73 executed, 33 from cache

Publishing build scan...
https://gradle.com/s/damugcurzbxg6

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #827

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/827/display/redirect?page=changes>

Changes:

[hsuryawirawan] Update Python Katas on Stepik based on the latest pipeline "with" style

[qz689] [BEAM-9543] built the basis for Match_Recog

[qz689] [BEAM-9543] built the basis for Match_Recog

[qz689] [BEAM-9543] implemented `partition by`

[qz689] [BEAM-9543] implemented `order by`

[qz689] [BEAM-9543] fixed `order by` coder issue

[qz689] [BEAM-9543] fixed `order by` coder issue

[qz689] [BEAM-9543] applied regex pattern match

[qz689] [BEAM-9543] applied regex pattern match

[qz689] [BEAM-9543] fixed sortKey serialization problem

[qz689] [BEAM-9543] fixed sortKey serialization problem

[qz689] [BEAM-9543] fixed serialization problem

[qz689] [BEAM-9543] recognized simple pattern

[qz689] [BEAM-9543] recognized simple pattern

[qz689] [BEAM-9543] fixed code style

[qz689] [BEAM-9543] supported regex quantifier

[qz689] [BEAM-9543] added javadoc

[qz689] [BEAM-9543] removed CEPTypeName.java

[qz689] [BEAM-9543] added Measures implementation (unfinished)

[qz689] [BEAM-9543] added Measures implementation

[qz689] [BEAM-9543] added Measures implementation

[qz689] [BEAM-9543] fixed minor issues

[qz689] [BEAM-9543] fixed minor style issues

[noreply] [BEAM-10599] Add documentation about CI on GitHub Action (#12405)

[noreply] Fix link for S3FileSystem (#12450)

[dcavazos] [BEAM-7390] Add min code snippets

[noreply] [BEAM-7390] Add max code snippets (#12409)

[noreply] [BEAM-7390] Add mean code snippets (#12437)


------------------------------------------
[...truncated 303.29 KB...]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 03, 2020 6:49:35 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 03, 2020 6:49:35 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 03, 2020 6:49:36 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 03, 2020 6:49:36 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 03, 2020 6:49:36 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 03, 2020 6:49:36 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 03, 2020 6:49:36 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
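
The failure above is Beam's standard coder-inference error for a PCollection of Row: a Row carries no default coder, so a schema (or an explicit RowCoder) has to be attached before the pipeline graph is finalized. A minimal sketch of the two fixes the message itself names, assuming a PCollection<Row> handle; the field names and types below are illustrative guesses, not the actual HACKER_NEWS schema:

    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    class RowSchemaFixSketch {
      // Hypothetical schema covering only the projected columns; the real
      // table has more fields and possibly different types.
      static final Schema OUTPUT_SCHEMA =
          Schema.builder()
              .addNullableField("author", Schema.FieldType.STRING)
              .addNullableField("type", Schema.FieldType.STRING)
              .addNullableField("title", Schema.FieldType.STRING)
              .addNullableField("score", Schema.FieldType.INT64)
              .build();

      // Option 1 from the message: attach the schema so Beam can infer the coder.
      static PCollection<Row> withRowSchema(PCollection<Row> rows) {
        return rows.setRowSchema(OUTPUT_SCHEMA);
      }

      // Option 2: set the coder explicitly via setCoder().
      static PCollection<Row> withExplicitCoder(PCollection<Row> rows) {
        return rows.setCoder(RowCoder.of(OUTPUT_SCHEMA));
      }
    }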

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 03, 2020 6:49:37 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 03, 2020 6:49:37 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 03, 2020 6:49:37 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 03, 2020 6:49:37 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 03, 2020 6:49:37 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 03, 2020 6:49:37 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 03, 2020 6:49:38 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 03, 2020 6:49:38 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
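
The plan above shows what DIRECT_READ push-down buys: only the used fields [by, type, title, score] are requested and the supported filter travels with the read. The IT drives this through the SQL BigQuery table provider; purely as an illustration, here is a sketch of the equivalent raw BigQueryIO read under assumed names (the table spec is a placeholder):

    import java.util.Arrays;
    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;

    class DirectReadSketch {
      // Storage API read carrying the same projection and filter the planner
      // pushed down; 'project-id:dataset.hacker_news' is a placeholder spec.
      static TypedRead<TableRow> pushedDownRead() {
        return BigQueryIO.readTableRows()
            .from("project-id:dataset.hacker_news")
            .withMethod(TypedRead.Method.DIRECT_READ)
            .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
            .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2");
      }
    }
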
    Aug 03, 2020 6:49:41 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 03, 2020 6:49:48 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 03, 2020 6:49:48 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-Ok8RZiJkiQLGVs0EgOeaekBxDcp_AQkpJqxKOuItRI0.jar
    Aug 03, 2020 6:49:49 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4921313479952871895.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-j-Wa5rKu8tkRnh8lAtTB9RjXj_sYJZ15mRneqF59pbc.jar
    Aug 03, 2020 6:49:49 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-TQOJUYTLgvkJ70USyO-WOXHk9SPXw329P-O2gpspZqg.jar
    Aug 03, 2020 6:49:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-m0jrz8gpzBUK7EK9TI3fzPCkdZ-DyJUHia3BGINfCsk.jar
    Aug 03, 2020 6:49:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-SPeMBbpPOBb6zePKpYZNnKB3RIenB_AWfZcDnlQp8uY.jar
    Aug 03, 2020 6:49:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests-CQHWcmeHC3yT10O31jhrZ7Ysm0H_KahU92ONIMoV958.jar
    Aug 03, 2020 6:49:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests-XfqqXceZHGoMCv8UlaVyhu8cGPnksAivgVGSfbi5WFo.jar
    Aug 03, 2020 6:49:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests-oIA71EToKM00TyVFIPumKW8OjCx9UCigakA1y-1jlMA.jar
    Aug 03, 2020 6:49:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT-gXBknuEFalcSmqQy0EHvTC16GTY9YPIJjxGDq9q5v5M.jar
    Aug 03, 2020 6:49:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests-f70sE0R2F-02cObD6fES6OZo2cmRwHdg6nljk6PSoG8.jar
    Aug 03, 2020 6:49:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-tests-KfGuWfCrNwjH7g7xdv2UmiWg9tL94BnawNvziYIrctk.jar
    Aug 03, 2020 6:49:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests-BK2vUvZZYz4RiyaVveTTl_77CcoXxbG3BS5P2nt6n2w.jar
    Aug 03, 2020 6:49:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests-Lpp9MLG6XQpHMaLdHt06RmoTj3c_FoSo119VvmeATFY.jar
    Aug 03, 2020 6:49:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-Vhjn_V1nqP_xaL0EZdrChQjEpruIlSdsBhTdOYrlmLs.jar
    Aug 03, 2020 6:49:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-8gi4UpPMz8TyARrjdE5JGRPg6D0Mcon1d_JREizD5xA.jar
    Aug 03, 2020 6:49:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-ou9n62D4B_82pi1gvi7R0pZsjM5WTQPA5-j0Hr65Ne8.jar
    Aug 03, 2020 6:49:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.24.0-SNAPSHOT-6uZ3jO99qwzVkrEZ0ll3UE5VRtv3UHoOuubdRtB6Jqo.jar
    Aug 03, 2020 6:49:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT--2QQdvpNuRNbiwAGAMg70N56v7cEkTNeu-VqOHOdOuI.jar
    Aug 03, 2020 6:49:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-Ok8RZiJkiQLGVs0EgOeaekBxDcp_AQkpJqxKOuItRI0.jar
    Aug 03, 2020 6:49:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-zvifhvmEENDoE0jNBTFNYmVBqmFV15KUSlKAqZrOBXI.jar
    Aug 03, 2020 6:49:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests-pENF_Ff6Jn3uxookHMKUizAHfslQFHgVUrAtezBfr4U.jar
    Aug 03, 2020 6:49:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-A3NIjqOt57q9DrR5RYAs7ilunRcpCR-HZY6RNeSoA7Y.jar
    Aug 03, 2020 6:49:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.24.0-SNAPSHOT-OOzlj8JRE1cqQDah2seoSClHVoYETX4I3jUUNNnfK6o.jar
    Aug 03, 2020 6:49:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded-xN3arNn7n2pPyKXXi7uGa5UJcTcKCvmidr8isChm1IA.jar
    Aug 03, 2020 6:49:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-J_FWe2304jGnV7wEoCzbfExc_jRfW1NzoeQw7zSP--c.jar
    Aug 03, 2020 6:49:55 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.24.0-SNAPSHOT-ccwzIe8QK2CCx_ag43VEvC4T5DOuIy04vJL5HoPCYiQ.jar
    Aug 03, 2020 6:49:57 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.24.0-SNAPSHOT-7n9heNCkskBKhgH_g3pGBA69gq1LOEKTxrxzcRkyyfw.jar
    Aug 03, 2020 6:49:57 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT-pFD7cPKMju9aeOX5lMhSfCgIuX_qUGiGU_C7WONUqeQ.jar
    Aug 03, 2020 6:49:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT-nuchQIANYP2WfGWYGDS2ESoLwYLCaVe3dOxYJq3PbbI.jar
    Aug 03, 2020 6:49:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.24.0-SNAPSHOT-6qyBvhQwgKL886p1csUKswn3u_qyndqSqCPcN9M-pGU.jar
    Aug 03, 2020 6:49:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.24.0-SNAPSHOT-G-ah3dTJaouVXdk4vObki5oB1nrKv7M_llHCzpWpJgU.jar
    Aug 03, 2020 6:49:58 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 10 seconds
    Aug 03, 2020 6:49:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 03, 2020 6:49:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 03, 2020 6:49:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 03, 2020 6:49:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 03, 2020 6:49:59 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 03, 2020 6:50:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93804 bytes, hash afb83bb0ae650e23dd3bb6c72674be6560dab28188d4f629e86cb010ef7ad745> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-r7g7sK5lDiPdO7bHJnS-ZWDasoGI1PYp6GywEO9610U.pb
    Aug 03, 2020 6:50:00 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.24.0-SNAPSHOT
    Aug 03, 2020 6:50:02 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-03_11_50_00-6971827836742586088?project=apache-beam-testing
    Aug 03, 2020 6:50:02 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-03_11_50_00-6971827836742586088
    Aug 03, 2020 6:50:02 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-03_11_50_00-6971827836742586088
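
Alongside the gcloud command above, the job can also be cancelled from the submitting JVM through the handle that Pipeline#run() returned; a minimal sketch, assuming the PipelineResult is still in scope:

    import java.io.IOException;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.PipelineResult;

    class CancelSketch {
      // run() returns a handle to the submitted Dataflow job; cancel() asks
      // the service to stop it (and may throw IOException if the call fails).
      static void runThenCancel(Pipeline pipeline) throws IOException {
        PipelineResult result = pipeline.run();
        // ... later, e.g. when a test deadline is exceeded:
        result.cancel();
      }
    }
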
    Aug 03, 2020 6:50:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-03T18:50:00.897Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 03, 2020 6:50:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-03T18:50:09.113Z: Worker configuration: n1-standard-1 in us-central1-b.
    Aug 03, 2020 6:50:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-03T18:50:09.815Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 03, 2020 6:50:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-03T18:50:09.899Z: Expanding GroupByKey operations into optimizable parts.
    Aug 03, 2020 6:50:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-03T18:50:09.927Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 03, 2020 6:50:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-03T18:50:09.987Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 03, 2020 6:50:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-03T18:50:10.011Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 03, 2020 6:50:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-03T18:50:10.029Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 03, 2020 6:50:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-03T18:50:10.054Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 03, 2020 6:50:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-03T18:50:10.372Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 03, 2020 6:50:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-03T18:50:10.513Z: Starting 5 workers in us-central1-b...
    Aug 03, 2020 6:50:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-03T18:50:36.545Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 03, 2020 6:50:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-03T18:50:42.281Z: Autoscaling: Raised the number of workers to 3 based on the rate of progress in the currently running stage(s).
    Aug 03, 2020 6:50:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-03T18:50:42.301Z: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
    Aug 03, 2020 6:50:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-03T18:50:47.625Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 03, 2020 6:51:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-03T18:51:04.638Z: Workers have started successfully.
    Aug 03, 2020 6:51:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-03T18:51:04.664Z: Workers have started successfully.
    Aug 03, 2020 6:51:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-03T18:51:47.939Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 03, 2020 6:51:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-03T18:51:48.122Z: Cleaning up.
    Aug 03, 2020 6:51:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-03T18:51:48.191Z: Stopping worker pool...
    Aug 03, 2020 6:52:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-03T18:52:38.911Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 03, 2020 6:52:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-03T18:52:38.940Z: Worker pool stopped.
    Aug 03, 2020 6:52:46 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-03_11_50_00-6971827836742586088 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 70a66886-7e46-4dda-988e-eca11e657c5e and timestamp: 2020-08-03T18:52:46.498000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    24.016
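
The two values above are Beam metrics (presumably populated by the RowMonitor/TimeMonitor ParDos in the job) that the test reads back from the finished PipelineResult. A minimal sketch of querying such a metric by name; the namespace string here is a placeholder assumption:

    import org.apache.beam.sdk.PipelineResult;
    import org.apache.beam.sdk.metrics.MetricNameFilter;
    import org.apache.beam.sdk.metrics.MetricQueryResults;
    import org.apache.beam.sdk.metrics.MetricsFilter;

    class MetricsSketch {
      // Query the "fields_read" metric from a finished job; "perf_test" is a
      // placeholder namespace, not the one the IT actually uses.
      static MetricQueryResults fieldsRead(PipelineResult result) {
        MetricsFilter filter =
            MetricsFilter.builder()
                .addNameFilter(MetricNameFilter.named("perf_test", "fields_read"))
                .build();
        return result.metrics().queryMetrics(filter);
      }
    }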

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 03, 2020 6:52:48 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.152 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.093 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 3 mins 35.006 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 35s
106 actionable tasks: 72 executed, 34 from cache

Publishing build scan...
https://gradle.com/s/nmo2epxgasfzy

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #826

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/826/display/redirect?page=changes>

Changes:

[Etienne Chauchot] [BEAM-10471] issue a JMX call to cassandra

[Etienne Chauchot] [BEAM-10471] fix a wrong comment

[Maximilian Michels] [BEAM-10602] Use python_streaming_pardo_5 table for latency results


------------------------------------------
[...truncated 292.98 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 03, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 03, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 03, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 03, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 03, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 03, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 03, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 03, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 03, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 03, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 03, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 03, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 03, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 03, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 03, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 03, 2020 12:45:12 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 03, 2020 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 03, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-OC0ASSBQa4EN_dBJimuXzzGcQdx-8kvphKtBaob-Ui8.jar
    Aug 03, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-1Dvfz4uUIRFPBM6IgU5QXA-WvN8mcfFcZCyQyJevW_Q.jar
    Aug 03, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-TiSKgDRTOSkNPhWePc5lEUozSpyMEsgFUHzaBM9C778.jar
    Aug 03, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2854437998634022312.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-ranfF72KOi7X7AH83xYCVTL6hc0aqm_goofLcBvSFU4.jar
    Aug 03, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded-NUIjvjN7ji5Zv_g4hM1Lx9wu3ciz5o5qqliZIC_UmrA.jar
    Aug 03, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests-8OcWuM1no6c2MmOF-Wkc8zESOOgJqBXU2QD083ktVU0.jar
    Aug 03, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT-XQeIxMwpCp0QTzsdjKypdECE3DouQ5lR2ywOOhoCJ5s.jar
    Aug 03, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.24.0-SNAPSHOT-hKWQ77mfvIFCSH1t8LCRATsdWBeLnrgqQEXdtDO82zQ.jar
    Aug 03, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests-pOGWGl9Mu57GSHJ8D_BFAyVGdOmpgDzJwwa8TnInj84.jar
    Aug 03, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT--NNRJoz3b_o5kXicCU3yLnJITzOdgr_AY5c5uVZ99b0.jar
    Aug 03, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT--X2cwPDX-Ko8nOjAxGRv06fhVCadxLggJcTFFNH1Jz0.jar
    Aug 03, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-JOlfDabuq_fjRU7dz7alPF3Jc-2A2zZlQD9Pthg6vn8.jar
    Aug 03, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-g2sbLCmXqnEZ5WLpwBv3sHUuvhvj7KSv6ELSniNvKN4.jar
    Aug 03, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests-7NY68eVcnuEi7JsWqv7gge_N2TGQZHh1AndkiiGjSDI.jar
    Aug 03, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests-ephdJECrOubxU_UHQ305trYAyfcp7IGDlNvm42G-3uw.jar
    Aug 03, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests-QJNRS8oBStNlrLa9zrLHLEXn-aJjyYRa-Jtvk1wKQA4.jar
    Aug 03, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.24.0-SNAPSHOT-oVK7XE7x5A_YFyU7Y_nLUuoKrDXeVKnfN1rr4HexIiI.jar
    Aug 03, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tBGb1TXZj6y6RIsn1S56J4HIVKmQcDRAbXHgmo_pTqA.jar
    Aug 03, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests-lRL5nDta1bDy5CJa8sUTvNOctLhjiS17Ku0DBrU0YvE.jar
    Aug 03, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-0gHE4nNvMnUmLSgUGU_i9RhpR7ZDmWr3kjfArDrm-IY.jar
    Aug 03, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-JKzwmAnYiNa-DoW0jkm_cLspeMvOu_JUsClqEiuVkh0.jar
    Aug 03, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-Hc9n2I2QEI3DPLhEDOJj8tgaWTH_-oeHOyWAQ5vPnjk.jar
    Aug 03, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-pl-ptnDrZsCUnRq8ftQUvZEeLHnhBxtCrUfWBXsZ_7o.jar
    Aug 03, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-OC0ASSBQa4EN_dBJimuXzzGcQdx-8kvphKtBaob-Ui8.jar
    Aug 03, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT-385fKg93ClCJU9Uy4yu_ll8Y8aahxdihNJzVxKkB-Ws.jar
    Aug 03, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests-ptz-pe3AJKgNdKiydI68EtyPJcxATzRb4LudAZeW_1M.jar
    Aug 03, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-tests-nKNODp3nDFppkT4Qz2a0FZUpc-bLuBHGxBzSVULI4nI.jar
    Aug 03, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.24.0-SNAPSHOT-FWSzfAbEc5WOR5qUNHR-jgEun8RmOj_IgS1X1HhBtcc.jar
    Aug 03, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.24.0-SNAPSHOT-DmCo8htOH3eD6EIlEa7XZQVTkWGmps9ztVRcUVctC2Y.jar
    Aug 03, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.24.0-SNAPSHOT-4wJoLjpiWdoK-L37AEaEFgoOA84okZ7aD9yhaDgTnXY.jar
    Aug 03, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.24.0-SNAPSHOT--QeqFoCY9kkE3WXd4R7jaavM_S0pVsygHYrR5wRIvt4.jar
    Aug 03, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 1 seconds
    Aug 03, 2020 12:45:15 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 03, 2020 12:45:15 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 03, 2020 12:45:15 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 03, 2020 12:45:15 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 03, 2020 12:45:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 03, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93803 bytes, hash 8d1f1edde50cc289ca0a6b73b048650992833966174fb7c74c37fa5e04402448> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-jR8e3eUMwonKCmtzsEhlCZKDOWYXT7fHTDf6XgRAJEg.pb
    Aug 03, 2020 12:45:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.24.0-SNAPSHOT
    Aug 03, 2020 12:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-03_05_45_15-15013796924992546995?project=apache-beam-testing
    Aug 03, 2020 12:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-03_05_45_15-15013796924992546995
    Aug 03, 2020 12:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-03_05_45_15-15013796924992546995
    Aug 03, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-03T12:45:15.784Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 03, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-03T12:45:22.922Z: Worker configuration: n1-standard-1 in us-central1-a.
    Aug 03, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-03T12:45:23.561Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 03, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-03T12:45:23.596Z: Expanding GroupByKey operations into optimizable parts.
    Aug 03, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-03T12:45:23.620Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 03, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-03T12:45:23.689Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 03, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-03T12:45:23.720Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 03, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-03T12:45:23.746Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 03, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-03T12:45:23.772Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 03, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-03T12:45:24.097Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 03, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-03T12:45:24.216Z: Starting 5 workers in us-central1-a...
    Aug 03, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-03T12:45:33.476Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 03, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-03T12:45:49.276Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 03, 2020 12:46:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-03T12:46:14.473Z: Workers have started successfully.
    Aug 03, 2020 12:46:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-03T12:46:14.494Z: Workers have started successfully.
    Aug 03, 2020 12:46:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-03T12:46:51.518Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 03, 2020 12:46:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-03T12:46:51.792Z: Cleaning up.
    Aug 03, 2020 12:46:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-03T12:46:51.874Z: Stopping worker pool...
    Aug 03, 2020 12:47:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-03T12:47:53.155Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 03, 2020 12:47:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-03T12:47:53.193Z: Worker pool stopped.
    Aug 03, 2020 12:47:58 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-03_05_45_15-15013796924992546995 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 65ccc15b-aeb0-4b4a-b669-fa884f3087b2 and timestamp: 2020-08-03T12:47:58.965000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    17.623

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 03, 2020 12:47:59 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.019 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.025 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 5,5,main]) completed. Took 2 mins 56.733 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 43s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/goju65vh2pdgk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #825

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/825/display/redirect>

Changes:


------------------------------------------
[...truncated 292.48 KB...]
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 03, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 03, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 03, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 03, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 03, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 03, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 03, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
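    The exception above names the fix itself: attach a schema (or an explicit coder) to the Row-typed output before the pipeline is finalized. The following is a minimal, self-contained sketch of that remedy, not taken from the failing test; the schema fields, the pass-through DoFn standing in for ParDo(RowMonitor), and all values are illustrative assumptions.

        import org.apache.beam.sdk.Pipeline;
        import org.apache.beam.sdk.schemas.Schema;
        import org.apache.beam.sdk.transforms.Create;
        import org.apache.beam.sdk.transforms.DoFn;
        import org.apache.beam.sdk.transforms.ParDo;
        import org.apache.beam.sdk.values.PCollection;
        import org.apache.beam.sdk.values.Row;

        public class RowSchemaFixSketch {
          public static void main(String[] args) {
            // Illustrative schema matching the projected columns in the query above.
            Schema schema =
                Schema.builder()
                    .addStringField("author")
                    .addStringField("type")
                    .addStringField("title")
                    .addInt32Field("score")
                    .build();

            Pipeline pipeline = Pipeline.create();

            Row row = Row.withSchema(schema).addValues("someone", "story", "Example title", 3).build();

            PCollection<Row> rows = pipeline.apply(Create.of(row).withRowSchema(schema));

            // A pass-through DoFn standing in for something like ParDo(RowMonitor): its Row output
            // has no inferable coder unless a schema (or coder) is attached to the output PCollection.
            rows.apply(
                    ParDo.of(
                        new DoFn<Row, Row>() {
                          @ProcessElement
                          public void processElement(@Element Row r, OutputReceiver<Row> out) {
                            out.output(r);
                          }
                        }))
                // Remedy suggested by the error: set the row schema so a RowCoder can be inferred.
                // The alternative is .setCoder(RowCoder.of(schema)).
                .setRowSchema(schema);

            pipeline.run().waitUntilFinish();
          }
        }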

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 03, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 03, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 03, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 03, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 03, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 03, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 03, 2020 6:45:17 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 03, 2020 6:45:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
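    For comparison with the DEFAULT-method failures, the push-down path logged here is driven by declaring the table with the DIRECT_READ method. A rough, hypothetical sketch of that wiring follows; the table name, location, and column types are placeholders, and BeamSqlEnv/BeamSqlRelUtils are the same internal helpers that appear in the stack traces above, so this is a sketch under those assumptions rather than the test's actual code.

        import org.apache.beam.sdk.Pipeline;
        import org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv;
        import org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils;
        import org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTableProvider;
        import org.apache.beam.sdk.values.PCollection;
        import org.apache.beam.sdk.values.Row;

        public class PushDownSketch {
          public static void main(String[] args) {
            Pipeline pipeline = Pipeline.create();
            BeamSqlEnv sqlEnv = BeamSqlEnv.inMemory(new BigQueryTableProvider());

            // Placeholder table definition; "method": "DIRECT_READ" is what enables
            // projection/filter push-down into the BigQuery Storage read.
            sqlEnv.executeDdl(
                "CREATE EXTERNAL TABLE HACKER_NEWS (`by` VARCHAR, type VARCHAR, title VARCHAR, score INTEGER) "
                    + "TYPE bigquery "
                    + "LOCATION 'my-project:my_dataset.hacker_news' "
                    + "TBLPROPERTIES '{\"method\": \"DIRECT_READ\"}'");

            String query =
                "SELECT `by` AS author, type, title, score FROM HACKER_NEWS "
                    + "WHERE (type = 'story' OR type = 'job') AND score > 2";

            // With DIRECT_READ the planner emits BeamPushDownIOSourceRel and logs the pushed filter,
            // as seen above; with the DEFAULT method it stays at BeamIOSourceRel and nothing is pushed down.
            PCollection<Row> rows = BeamSqlRelUtils.toPCollection(pipeline, sqlEnv.parseQuery(query));

            pipeline.run().waitUntilFinish();
          }
        }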
    Aug 03, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 03, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 03, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-eyJRDuKR_nGmPFjuQNYirKsbglprFld_cyTW7XiNGWo.jar
    Aug 03, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.24.0-SNAPSHOT-Zb2FrixvvhZLBeIb-wbcqzij2wzLlshrVuYKJ5ptxqk.jar
    Aug 03, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests-ZCWtgAjg4Tv9MmvFh6iWw_oTkZhxI2BYhKbVoYGPvxg.jar
    Aug 03, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3063486025020951071.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-FgvaaKT5WFtlrtPF0eyO42iyM7U_HH5naVH9LlXOVpA.jar
    Aug 03, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests-gHY0CZfzftbyt0yACZ-fcxk_Jd5Gilr-rvoyYgc-1yA.jar
    Aug 03, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-9Og7StUYyVA_mpXrcmSwhEGoMgSpDXuUZca5UEB0mfw.jar
    Aug 03, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests-tKRpnD65kl1-8ioEa2OXsHxutYyWsH2HbJoNZGrHftE.jar
    Aug 03, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-qNs-9QlBr1V9mKISgiSdxeNXX57HZBbCFx-3vmUWe9A.jar
    Aug 03, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-eyJRDuKR_nGmPFjuQNYirKsbglprFld_cyTW7XiNGWo.jar
    Aug 03, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-28_GKxHFh2xR6VaOUGcK4DcYviSU3EDsXaySH7A0jos.jar
    Aug 03, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT-AXEvu5KjdJvHKBQM3fsOnMNBSXS5KyZ6U_c_ceCI3Vk.jar
    Aug 03, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests-Zit1EINjI9WflUZo-ku5LzOtaLCI4FPympTtqGAKXTk.jar
    Aug 03, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-iW6Jz21srfzX_d35hBZ4as8elqX4qzY8NcyzMxEVegc.jar
    Aug 03, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests-wMTupSdo0WqpByufDb_CeGFi_QEeMS7s3n1C1PIluQY.jar
    Aug 03, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-C9yQfak2VCE6exOqYTYGuz5qVcATM6b-CAe4wqe3Oo8.jar
    Aug 03, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-PN1b5_IRnGQ3xYWa_ABB5GQ-0PSiaKvMA6498JPw47c.jar
    Aug 03, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT-J6bri2AXO2XOiKFA7PCFrQ6IAqauEvqnO_-jd35aA6c.jar
    Aug 03, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT-ER3ofiNOogiZC430TaHQVS8egQTemjvRbqEg5zR9J04.jar
    Aug 03, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-8MBuf48FXKMs2m0YCbMSrvPt1rGSbLwfMQpR1cpBYpI.jar
    Aug 03, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-tests-odaF1NPBWxdJEpIw1LLRcWS237SAZxbQxbJ_rXy9JS0.jar
    Aug 03, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-BC3_LMw8Zv_1MLcti7a_HhPwAwKI_f-Mxb-RpLFXrWg.jar
    Aug 03, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-meMfBuxuApbZs59bglFYLeBcU1HskbwFsz05_hxFMVo.jar
    Aug 03, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests-Q3NEPVL6X403RgQer0KpsG8aBmRZAabgPORlnCO3Zns.jar
    Aug 03, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.24.0-SNAPSHOT-EgxNPLi89v3cXwbKriWk6cVCtPpREndKjPtFFUC99Bw.jar
    Aug 03, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests-XxY70ZwEKVeZk6Ix288T8zRb9H44fxFOpwIH0V7sK98.jar
    Aug 03, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.24.0-SNAPSHOT-BLVFtU9QpqzGNJB2QkoYtpLKoMsoEmJev-jUIofAigA.jar
    Aug 03, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded-5r1B9VYaXEEmG0c7r80_xn8vieCcPu1v5Fi2vHLyqng.jar
    Aug 03, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-vKTEfTV_xRu3TYmdD2Qc3goaGky4On_ghIytI4GnLxo.jar
    Aug 03, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.24.0-SNAPSHOT-k4N1d4hAM87QhgvasSYqSHnKABYAN1pV0BkSTFnNwk0.jar
    Aug 03, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.24.0-SNAPSHOT-QvN7r1OII73vjMZ1kqRaHpGfFQtVrFvS-Ely4wOi-cE.jar
    Aug 03, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.24.0-SNAPSHOT-GwooX831CNrYnCPiE2kKjLknQ14_sj2LGNBeLbiFTqQ.jar
    Aug 03, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 1 seconds
    Aug 03, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 03, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 03, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 03, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 03, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 03, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93803 bytes, hash 47c2b700382b58994410be4c05e7039a4ea1e3281a40b5fcccd1c4ca9ba4b6c8> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-R8K3ADgrWJlEEL5MBecDmk6h4ygaQLX8zNHEypuktsg.pb
    Aug 03, 2020 6:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.24.0-SNAPSHOT
    Aug 03, 2020 6:45:22 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-02_23_45_21-15603525525553818066?project=apache-beam-testing
    Aug 03, 2020 6:45:22 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-02_23_45_21-15603525525553818066
    Aug 03, 2020 6:45:22 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-02_23_45_21-15603525525553818066
    Aug 03, 2020 6:45:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-03T06:45:21.397Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 03, 2020 6:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-03T06:45:29.842Z: Worker configuration: n1-standard-1 in us-central1-f.
    Aug 03, 2020 6:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-03T06:45:30.669Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 03, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-03T06:45:30.714Z: Expanding GroupByKey operations into optimizable parts.
    Aug 03, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-03T06:45:30.748Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 03, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-03T06:45:30.833Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 03, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-03T06:45:30.863Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 03, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-03T06:45:30.894Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 03, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-03T06:45:30.963Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 03, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-03T06:45:31.617Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 03, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-03T06:45:31.692Z: Starting 5 workers in us-central1-f...
    Aug 03, 2020 6:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-03T06:45:52.935Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 03, 2020 6:46:07 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-03T06:46:06.158Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Aug 03, 2020 6:46:07 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-03T06:46:06.200Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Aug 03, 2020 6:46:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-03T06:46:11.599Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 03, 2020 6:46:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-03T06:46:25.726Z: Workers have started successfully.
    Aug 03, 2020 6:46:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-03T06:46:25.767Z: Workers have started successfully.
    Aug 03, 2020 6:47:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-03T06:47:07.382Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 03, 2020 6:47:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-03T06:47:07.598Z: Cleaning up.
    Aug 03, 2020 6:47:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-03T06:47:07.701Z: Stopping worker pool...
    Aug 03, 2020 6:47:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-03T06:47:57.489Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 03, 2020 6:47:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-03T06:47:57.538Z: Worker pool stopped.
    Aug 03, 2020 6:48:04 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-02_23_45_21-15603525525553818066 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 1533e394-198a-458d-ae70-6e531de5dc93 and timestamp: 2020-08-03T06:48:04.866000000Z:
                     Metric:                    Value:
                   read_time                    21.252
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 03, 2020 6:48:05 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 2,5,main]) completed. Took 2 mins 56.082 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 45s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/jv7yz5oj2qioc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #824

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/824/display/redirect>

Changes:


------------------------------------------
[...truncated 294.23 KB...]
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 03, 2020 12:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 03, 2020 12:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 03, 2020 12:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 03, 2020 12:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 03, 2020 12:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 03, 2020 12:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 03, 2020 12:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 03, 2020 12:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 03, 2020 12:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 03, 2020 12:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 03, 2020 12:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 03, 2020 12:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 03, 2020 12:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 03, 2020 12:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 03, 2020 12:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 03, 2020 12:45:13 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 03, 2020 12:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 03, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-iVLHR4pJprS1mS-ZyPJr16FMj8bWyCyX-DxlhpxgN6U.jar
    Aug 03, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests-xQxWn5RQ7R47R7uyqPxekDzak6lwsUs2U0fFEu8U2NE.jar
    Aug 03, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests-1MkBDt9Fcto34GevcM-mmie-3gIZVeWcjeoM9YvEJOc.jar
    Aug 03, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-lnczDzFmhPq9h-7rXqRoZSLsolh1tmuqopCBtL8m_Ko.jar
    Aug 03, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-7OwVHdRk-7hyTYawc8lHgnAbDZwT3KQ1GsMPqOGxU3Y.jar
    Aug 03, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-S9H4R17PcmjG1exE47YgoVBc9QS6dXymJTTEMw7ONhg.jar
    Aug 03, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT--1jQjxLA11K4YixCsKDCn0itEA0QMwz7oKbmeNkMZQs.jar
    Aug 03, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests-L_dyzq5efF4LgkKbIYYnbQ7vE_dXtuc3gZSgvBlQ1eM.jar
    Aug 03, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.24.0-SNAPSHOT-eG5s-IOX-UHJWkmGGnhnWK5LaldY5F7Yi1LUNKOfRbo.jar
    Aug 03, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests-5dQbGHvV8tpd5S4LJUCq8L84gB1SaiFkNy8OgqBhyAw.jar
    Aug 03, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-iVLHR4pJprS1mS-ZyPJr16FMj8bWyCyX-DxlhpxgN6U.jar
    Aug 03, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests-_U-YxhjbQIUc6EYO-ZCbw7AIOIfy-qRkAZsagWjlEkE.jar
    Aug 03, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests-vQjJgt91aNzuqFuwBY8BeN1A1Wzj_oi678iB8WCkrCo.jar
    Aug 03, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-8oUim_LuxtA7XmLVv_0WOTFSx-WphuqeRBC-XixsuDw.jar
    Aug 03, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-cokGI966iF_5uz2CGASLyMKX4W_DtMp5s-BGzvjUgC4.jar
    Aug 03, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-htGJRyTlfmTa4wDy9DylmrVzyH9Ct89C3CYZR7tNXRQ.jar
    Aug 03, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-G5XJB5wYGcAtvKpAMhgDn0aoG9m7phN8wwLN6Y8QBU8.jar
    Aug 03, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2922978460355457157.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-A3KMO08B2Wu3-UH3qw2juiIbpXhypTyAm-e-KAGlbjA.jar
    Aug 03, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-cMs-G9zOZJcpjGtII_LvKWcOd6XBst0ZoeCbwNV7T8k.jar
    Aug 03, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.24.0-SNAPSHOT-XyggHlAVP6GEQIRmV_sF4e6MZA34pXn_LQdoLn8vgKg.jar
    Aug 03, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT-6XfQdTw4hX_JB_oGf22Kj_fqKOnx6wjlGFm-USTcLvM.jar
    Aug 03, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded-SYN6eLzqhKS0xkAyeCwiNEtrz3akH2nSDIIc15tryhY.jar
    Aug 03, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-QYIe00bq5w-OkzDRBBpTermveH3rtW_alA84DyISw-g.jar
    Aug 03, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-tests-8uHVfaQ8DJvLycBb5uhKPcvwRefYB74ecY1FSJMF9EY.jar
    Aug 03, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-DfHGbOWq0jgX_MMunLOyqbKBjenqGO2sBjYcoj4bXY8.jar
    Aug 03, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests-RfYxuJXtpnHxGno1rJ_XhaxOHlzzym7Iq5h7RvgXm0A.jar
    Aug 03, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT-gZ6Lgn0vt4s9WM0AmSjr1Z29Lpx2jqnuiVAS6y8ron8.jar
    Aug 03, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.24.0-SNAPSHOT-TKSqEaMEqdkZlnVdjQpwD2yccatW83nfVHuMuIhFANk.jar
    Aug 03, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.24.0-SNAPSHOT-WQv4SkiXuCWaWY2FjAWQgfrF3DrqkQVFaGLmyBNc5RY.jar
    Aug 03, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.24.0-SNAPSHOT-hgCBjVCR2-6fTYlIl6ow0hHSQqLHzhzzvFVTlhe3_o0.jar
    Aug 03, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.24.0-SNAPSHOT-OVVXhr7YVy5Y3nWWKiOx7knYLFxY_w97jY9_ZXy8pYQ.jar
    Aug 03, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 1 seconds
    Aug 03, 2020 12:45:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 03, 2020 12:45:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 03, 2020 12:45:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 03, 2020 12:45:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 03, 2020 12:45:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 03, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93803 bytes, hash c9694c7078ec261f2fce36121ccce514f5fa26fbcc6fe8685d92e30d0f107143> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-yWlMcHjsJh8vzjYSHMzlFPX6JvvMb-hoXZLjDQ8QcUM.pb
    Aug 03, 2020 12:45:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.24.0-SNAPSHOT
    Aug 03, 2020 12:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-02_17_45_17-310732360619070997?project=apache-beam-testing
    Aug 03, 2020 12:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-02_17_45_17-310732360619070997
    Aug 03, 2020 12:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-02_17_45_17-310732360619070997
    Aug 03, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-03T00:45:17.610Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 03, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-03T00:45:26.000Z: Worker configuration: n1-standard-1 in us-central1-a.
    Aug 03, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-03T00:45:26.749Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 03, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-03T00:45:26.834Z: Expanding GroupByKey operations into optimizable parts.
    Aug 03, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-03T00:45:26.876Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 03, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-03T00:45:26.953Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 03, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-03T00:45:26.989Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 03, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-03T00:45:27.024Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 03, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-03T00:45:27.048Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 03, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-03T00:45:27.387Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 03, 2020 12:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-03T00:45:27.467Z: Starting 5 workers in us-central1-a...
    Aug 03, 2020 12:45:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-03T00:45:47.889Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 03, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-03T00:45:52.066Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Aug 03, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-03T00:45:52.100Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Aug 03, 2020 12:46:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-03T00:46:13.287Z: Workers have started successfully.
    Aug 03, 2020 12:46:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-03T00:46:13.319Z: Workers have started successfully.
    Aug 03, 2020 12:46:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-03T00:46:34.766Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 03, 2020 12:46:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-03T00:46:52.191Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 03, 2020 12:46:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-03T00:46:52.377Z: Cleaning up.
    Aug 03, 2020 12:46:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-03T00:46:52.460Z: Stopping worker pool...
    Aug 03, 2020 12:47:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-03T00:47:53.327Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 03, 2020 12:47:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-03T00:47:53.377Z: Worker pool stopped.
    Aug 03, 2020 12:48:01 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-02_17_45_17-310732360619070997 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 0a38d8d4-625f-47ec-b3fd-a8cd1389e741 and timestamp: 2020-08-03T00:48:01.659000000Z:
                     Metric:                    Value:
                   read_time                    19.341
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 03, 2020 12:48:02 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 2 mins 58.516 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 47s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/2qfvnpadm6fxw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #823

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/823/display/redirect>

Changes:


------------------------------------------
[...truncated 291.70 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 02, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 02, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 02, 2020 6:45:21 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 02, 2020 6:45:21 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 02, 2020 6:45:21 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 02, 2020 6:45:21 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 02, 2020 6:45:21 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
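
The IllegalStateException above is the coder-inference failure the error text describes: the Row output of the RowMonitor ParDo carries neither a schema nor an explicit coder. A minimal, self-contained sketch of the remedy the message itself suggests (the schema, values, and transform name below are illustrative assumptions, not the actual BigQueryIOPushDownIT code):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Illustrative schema; the real HACKER_NEWS table has more fields.
        Schema schema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();

        PCollection<Row> rows =
            p.apply(
                Create.of(Row.withSchema(schema).addValues("a", "story", "t", 3L).build())
                    .withRowSchema(schema));

        rows.apply(
                "RowMonitorLike",
                ParDo.of(
                    new DoFn<Row, Row>() {
                      @ProcessElement
                      public void process(@Element Row row, OutputReceiver<Row> out) {
                        out.output(row);
                      }
                    }))
            // Without this call (or an equivalent setCoder(RowCoder.of(schema))),
            // coder inference fails with the IllegalStateException shown above.
            .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }

Either attaching the schema or setting a RowCoder resolves the failure; setRowSchema is the form the error message recommends for collections of Beam Rows.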

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 02, 2020 6:45:21 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 02, 2020 6:45:21 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 02, 2020 6:45:21 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 02, 2020 6:45:21 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 02, 2020 6:45:21 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 02, 2020 6:45:21 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 02, 2020 6:45:21 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 02, 2020 6:45:21 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
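
The DIRECT_READ method reported above is what enables this field and filter push-down via the BigQuery Storage API. As a hedged sketch of how a Beam SQL table can opt into it, the DDL below uses an illustrative column subset and a placeholder LOCATION; the TBLPROPERTIES "method" key follows the Beam SQL BigQuery table provider's documented options and is an assumption here, not something read from this test's setup:

    CREATE EXTERNAL TABLE HACKER_NEWS(
      title VARCHAR,
      `by` VARCHAR,
      `type` VARCHAR,
      score BIGINT
    )
    TYPE bigquery
    LOCATION 'project-id:dataset.hacker_news'
    TBLPROPERTIES '{"method": "DIRECT_READ"}'

With the DEFAULT method, as in the readUsingDefaultMethod plan above, the planner keeps a plain BeamIOSourceRel and no push-down is applied.
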
    Aug 02, 2020 6:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 02, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 02, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-rXHsQq1pYx2eRl4QZUQ85QdHzLojWcgfzvHmMh6h6bc.jar
    Aug 02, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT-fiEW5SKb3dS-Wyc22ejSswFccBeBJfxt2Hyz0PhY1P4.jar
    Aug 02, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-ANiZkOPTEFFROG6OiZARRTQPzZhD-N4gjfiss6mlHVA.jar
    Aug 02, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests-GwJMed1D6gtG72Y-kpZMYIqPa0-I4NvuoVC3fZEL7Q4.jar
    Aug 02, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-9KOCONbu9jmgiYufAob1DkwZr0vSw6Y6ECWNZO_mKvw.jar
    Aug 02, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-BwKm9T8VLhVCvNng8o7AFMQs8PurpDtgYu9dnctiK1c.jar
    Aug 02, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-z7CS2Uni3ZsGFamnMtEhBiCnKT3lvRyAmBs_KAv_IOQ.jar
    Aug 02, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests-fHXffiH2Dk6LQHq25IDH41rvw8CV7FCYSNdUcyDBHyI.jar
    Aug 02, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests-S6bgH7kHQmSdOWr5CZj7j_atsFa1FtqhpKJ_WrMoLLk.jar
    Aug 02, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4207925073157116350.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-D_a6whNM4qejZSk7q1sf3hrFui0i09RN6QzoG2kpDdg.jar
    Aug 02, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests-hG13wIJznfAx_-AHfE8Q9Xcg8247vFuhwYnKwmqte28.jar
    Aug 02, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.24.0-SNAPSHOT-UE7q5wPTlAnsZMhc0JqxUZmgfz67yK2ZBKKG4GX3slk.jar
    Aug 02, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-yWWpLLRg-o6qeMDkqyxdpJn-ORaXaeZNbO1ojR_IveI.jar
    Aug 02, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT-C8V2L3m-3aXTDtpSd-wovy7vUnudsfn2ErtBjtSSD9I.jar
    Aug 02, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests-_snU_GeBNWnhR01gsgp1ZByb8JpYGq143M2KGULkFp4.jar
    Aug 02, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-yb5-7E_K9rQfbajvIn7gsgu0j81701yA4xeN39seisU.jar
    Aug 02, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests-cnZRSoFCYCUHAk8Sk3KjGXq2FTkuOofE6MIplw9r7Cs.jar
    Aug 02, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-rXHsQq1pYx2eRl4QZUQ85QdHzLojWcgfzvHmMh6h6bc.jar
    Aug 02, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT-xWr2WIOn_sKff74SnGnmtwYbmmTyU7asKUKvPjAYATw.jar
    Aug 02, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-NslLD7ZNwK7AVe2x4sN7h5faGZ25sHPlLxK_a73WiYw.jar
    Aug 02, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-aXnts2n29QwXoD6xQ_WaRnasba9qGR2tIBrFCO_DkoY.jar
    Aug 02, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.24.0-SNAPSHOT-iP6gbUmpVDU4JGj32tJzWRbrhOUwNq9KjKfsdt0fEXM.jar
    Aug 02, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-tests-6ILv8vq-pMuLKp1KuxCmv6BiCK_5Pc0d60ugHK7Vmhc.jar
    Aug 02, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-_DRsoMqsO3epJiYKzcp3aFK6ukxWO44UYPmuJcRK6pE.jar
    Aug 02, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.24.0-SNAPSHOT-9iRVYvcVMhJIeGRAAmskxVul8dqceO62hq_eglwHCxk.jar
    Aug 02, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests-bV0Bl9WO0SUJ5DrRemJng2TN5IQnG3VWLY0031JNxrs.jar
    Aug 02, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded-92eXzC05KbiOWxofcU4stC54XVePB0WBXmNUNFgjzO0.jar
    Aug 02, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-_7NKWeUcSbgTZpPubVDRPxVP_kOZFJo5qAG2EcFUndg.jar
    Aug 02, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.24.0-SNAPSHOT-fXMrFTlfwvQa4ZxsXzSLLHF2WL7RkygugO5APeM4RMw.jar
    Aug 02, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.24.0-SNAPSHOT-BnAxmT_ZdyWwJuVJO-Tl1qpKW_l9-dcDrT09UQQFKV8.jar
    Aug 02, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.24.0-SNAPSHOT--xwEqkYn6KGl1RNEveqqaZC8p0ReqKsyOlEbXreZRIA.jar
    Aug 02, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 0 seconds
    Aug 02, 2020 6:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 02, 2020 6:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 02, 2020 6:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 02, 2020 6:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 02, 2020 6:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 02, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93803 bytes, hash 55fd9a32294aa8442826efb8493873e91efe5b9a1b5d8ea7d3dc78e98c41c504> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Vf2aMilKqEQoJu-4SThz6R7-W5obXY6n09x46YxBxQQ.pb
    Aug 02, 2020 6:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.24.0-SNAPSHOT
    Aug 02, 2020 6:45:27 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-02_11_45_25-11552063919498003225?project=apache-beam-testing
    Aug 02, 2020 6:45:27 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-02_11_45_25-11552063919498003225
    Aug 02, 2020 6:45:27 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-02_11_45_25-11552063919498003225
    Aug 02, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-02T18:45:25.810Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 02, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-02T18:45:33.214Z: Worker configuration: n1-standard-1 in us-central1-a.
    Aug 02, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-02T18:45:33.875Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 02, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-02T18:45:33.915Z: Expanding GroupByKey operations into optimizable parts.
    Aug 02, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-02T18:45:33.931Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 02, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-02T18:45:33.996Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 02, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-02T18:45:34.023Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 02, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-02T18:45:34.044Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 02, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-02T18:45:34.074Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 02, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-02T18:45:34.506Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 02, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-02T18:45:34.586Z: Starting 5 workers in us-central1-a...
    Aug 02, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-02T18:45:43.275Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 02, 2020 6:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-02T18:46:01.395Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 02, 2020 6:46:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-02T18:46:26.795Z: Workers have started successfully.
    Aug 02, 2020 6:46:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-02T18:46:26.822Z: Workers have started successfully.
    Aug 02, 2020 6:47:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-02T18:47:01.417Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 02, 2020 6:47:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-02T18:47:01.614Z: Cleaning up.
    Aug 02, 2020 6:47:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-02T18:47:01.682Z: Stopping worker pool...
    Aug 02, 2020 6:48:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-02T18:48:01.653Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 02, 2020 6:48:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-02T18:48:01.697Z: Worker pool stopped.
    Aug 02, 2020 6:48:08 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-02_11_45_25-11552063919498003225 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): c2fdc7da-579f-4305-baf7-786dfa260fa8 and timestamp: 2020-08-02T18:48:08.804000000Z:
                     Metric:                    Value:
                   read_time                    17.905
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 02, 2020 6:48:09 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.031 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker,5,main]) completed. Took 2 mins 55.7 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 53s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/gjupxj4qeuyes

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #822

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/822/display/redirect>

Changes:


------------------------------------------
[...truncated 292.36 KB...]
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 02, 2020 12:45:24 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 02, 2020 12:45:24 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 02, 2020 12:45:24 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 02, 2020 12:45:24 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 02, 2020 12:45:24 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 02, 2020 12:45:24 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 02, 2020 12:45:24 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 02, 2020 12:45:25 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 02, 2020 12:45:25 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 02, 2020 12:45:25 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 02, 2020 12:45:25 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 02, 2020 12:45:25 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 02, 2020 12:45:25 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 02, 2020 12:45:25 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 02, 2020 12:45:25 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 02, 2020 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 02, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 02, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-jZo4EiAx2R43RT_dBIOpe1IKuir6wErGVMZ3nY_7vJs.jar
    Aug 02, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests-58F6yP5_fwPgu5NLPVwrr21q3Fig7QLOoRuRMI8F3OE.jar
    Aug 02, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT-4uOT8LdnWbqKc8cXdjC6DV9x1UYmtSJjhQdoF0ATFjE.jar
    Aug 02, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests-XJZmqla2pWe92i8ccTARHQzF0ysif0ptKSjFjkilx1c.jar
    Aug 02, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests-dCahFKXKqwlm1XqCGDKJwXgSUruX_1D0vpoPBSc22Fg.jar
    Aug 02, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-lTKDLBcdMltZcAPmjIxoex__JE1SX7w02Mp2Q04D9LU.jar
    Aug 02, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT-Wf7xcbWOs6UXcoQo2588AKqyyVu4ie6k3WcgFSdyUEQ.jar
    Aug 02, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-jshn5NgnMHTBf7BdZ7usAED4zPzfPemE8rQoA0-zJ10.jar
    Aug 02, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-spGrx2o9l6USDdMqkHCR_kUf8kwIHsV9UHV3Ocgslls.jar
    Aug 02, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-QWP7mg6OaVjSjghMS8azM5NZJqh5km2qScXojy-YnTU.jar
    Aug 02, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-jZo4EiAx2R43RT_dBIOpe1IKuir6wErGVMZ3nY_7vJs.jar
    Aug 02, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests-zSeUT1dSBOSVegr4IXnuujWR7hnITh0aGua5-DStmhY.jar
    Aug 02, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-eLZJWSDBO2AWV4ewj_yGxBtsuAPt6YfdcuxQHBzNdOg.jar
    Aug 02, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-tests-H_o-AR1mt2VBkwTF5h1rZJaBkQh9ZiDYGfplEHW5LQE.jar
    Aug 02, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests-ADsrerYkAwOIIDtTVVH99wfJKR-dhLXLgt3xAav_05E.jar
    Aug 02, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests-Jz8oBM75K49nJH23VUs-LpdDsfjjEvrHicPbYCZT--0.jar
    Aug 02, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.24.0-SNAPSHOT-CkC1pVMU71UOum3733hvJYY4nMWlImCZCcXq7IZPmLw.jar
    Aug 02, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-JWIt94q607eW6LOI13i2TRqQYphViGQPjs0mJGE5dQU.jar
    Aug 02, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT-I4Ok0o42y5dh2zEDf19UKXXHqGt4Jb2LYYM8WeQ3hoY.jar
    Aug 02, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-6DkQeyzqS0E3q7Y8Vk84aMcBNINgWYFxrfys6mwGY-0.jar
    Aug 02, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7111224783925949928.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-CE86LUzBArAPJttKore2w-ryQWNT8cPTbQnorDIt1cc.jar
    Aug 02, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-VPw2SuMzPmJi0BsrnyTcvqlMERIgt1pZmHVGaE79GqU.jar
    Aug 02, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-NWMVPZn_oaVZ-KojNFFLCErsI8MaCillze9TPswV9Gs.jar
    Aug 02, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.24.0-SNAPSHOT-bMceJ57P2ZM1plQWwCgueDMAwvLmmM2SBypXhWO5rpA.jar
    Aug 02, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded-KYSIGQx03ehSNEldBtKrCUFsPS8KsHId1SJTpyJPgxk.jar
    Aug 02, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests-5hVo33k_S-OHoSjZfSwzPb8ORAY3xiUz2TEPgaWYVyg.jar
    Aug 02, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.24.0-SNAPSHOT-1P6ZDVf4LDpCRn53_IKB_0Mm__BUfSuof9ceyuk6WI8.jar
    Aug 02, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-R2RePeG_a2ussrk701zx9-A9-m5luFtXTYH28jt1T2Q.jar
    Aug 02, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.24.0-SNAPSHOT-SVOMhIZdFdiP1BEirYfxUaaF5Hh8eLakghFyXvcTwu8.jar
    Aug 02, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.24.0-SNAPSHOT-eklLx25eWXhAQMprLpOd4wWRaMh3hkuIn6m5juzbtrE.jar
    Aug 02, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.24.0-SNAPSHOT-aYbBb4Gs-gc-olwUKzCo6IuW3rpa8s80Sbld1pfJkjM.jar
    Aug 02, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/com.alibaba/fastjson/1.2.68/9e3d29f05bcfab1c15a1357ebf2dd513c1d42f49/fastjson-1.2.68.jar to gs://temp-storage-for-perf-tests/loadtests/staging/fastjson-1.2.68-cGrbCezeeBQfDPJGWh6b307ug_n5g8_BYqWhckhy_rs.jar
    Aug 02, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 184 files cached, 31 files newly uploaded in 1 seconds
    Aug 02, 2020 12:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 02, 2020 12:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 02, 2020 12:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 02, 2020 12:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 02, 2020 12:45:30 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 02, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93803 bytes, hash dea2132bbeae18d2b55b24521b672eb7340dd077e53fb64b1261d2ff3a956726> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-3qITK76uGNK1WyRSG2cutzQN0HflP7ZLEmHS_zqVZyY.pb
    Aug 02, 2020 12:45:30 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.24.0-SNAPSHOT
    Aug 02, 2020 12:45:32 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-02_05_45_30-12803844370800601421?project=apache-beam-testing
    Aug 02, 2020 12:45:32 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-02_05_45_30-12803844370800601421
    Aug 02, 2020 12:45:32 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-02_05_45_30-12803844370800601421
    Aug 02, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-02T12:45:30.947Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 02, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-02T12:45:38.853Z: Worker configuration: n1-standard-1 in us-central1-a.
    Aug 02, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-02T12:45:39.565Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 02, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-02T12:45:39.609Z: Expanding GroupByKey operations into optimizable parts.
    Aug 02, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-02T12:45:39.651Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 02, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-02T12:45:39.720Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 02, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-02T12:45:39.751Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 02, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-02T12:45:39.789Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 02, 2020 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-02T12:45:39.813Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 02, 2020 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-02T12:45:40.219Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 02, 2020 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-02T12:45:40.300Z: Starting 5 workers in us-central1-a...
    Aug 02, 2020 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-02T12:46:09.800Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
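
The warning above concerns Cloud Monitoring custom metric descriptors accumulating in the project: each distinct user-defined metric name creates one descriptor, and once 100 exist Dataflow stops creating new custom metrics for a job. A rough clean-up sketch with the Cloud Monitoring v3 Java client follows; the project id is taken from this log, but the type-prefix filter is an assumption and should be checked against the project's actual descriptors before deleting anything.

    import com.google.api.MetricDescriptor;
    import com.google.cloud.monitoring.v3.MetricServiceClient;
    import com.google.monitoring.v3.DeleteMetricDescriptorRequest;
    import com.google.monitoring.v3.ListMetricDescriptorsRequest;
    import com.google.monitoring.v3.ProjectName;

    public class DeleteOldDataflowMetricDescriptors {
      public static void main(String[] args) throws Exception {
        try (MetricServiceClient client = MetricServiceClient.create()) {
          ListMetricDescriptorsRequest list =
              ListMetricDescriptorsRequest.newBuilder()
                  .setName(ProjectName.of("apache-beam-testing").toString())
                  .build();
          for (MetricDescriptor d : client.listMetricDescriptors(list).iterateAll()) {
            // Assumed prefix for Dataflow-created custom metrics; verify before deleting.
            if (d.getType().startsWith("custom.googleapis.com/dataflow/")) {
              client.deleteMetricDescriptor(
                  DeleteMetricDescriptorRequest.newBuilder().setName(d.getName()).build());
            }
          }
        }
      }
    }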
    Aug 02, 2020 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-02T12:46:17.688Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 02, 2020 12:46:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-02T12:46:38.492Z: Workers have started successfully.
    Aug 02, 2020 12:46:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-02T12:46:38.526Z: Workers have started successfully.
    Aug 02, 2020 12:47:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-02T12:47:23.050Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 02, 2020 12:47:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-02T12:47:23.253Z: Cleaning up.
    Aug 02, 2020 12:47:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-02T12:47:23.347Z: Stopping worker pool...
    Aug 02, 2020 12:48:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-02T12:48:21.835Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 02, 2020 12:48:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-02T12:48:21.880Z: Worker pool stopped.
    Aug 02, 2020 12:48:29 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-02_05_45_30-12803844370800601421 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): d4811990-302a-4777-b501-8b9181a60071 and timestamp: 2020-08-02T12:48:29.474000000Z:
                     Metric:                    Value:
                   read_time                    22.971
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 02, 2020 12:48:29 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.025 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.044 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 3 mins 14.522 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 12s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/qwq2qqoxl5bxy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #821

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/821/display/redirect>

Changes:


------------------------------------------
[...truncated 292.05 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 02, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 02, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 02, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 02, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 02, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 02, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 02, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
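
This IllegalStateException is the generic symptom of a PCollection<Row> that has no schema attached, so no RowCoder can be inferred for the ParDo output. A minimal sketch of the two remedies the message itself suggests is below; the schema, element and transform are hypothetical stand-ins, not the integration test's actual code.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Schema schema = Schema.builder()
            .addStringField("author")
            .addStringField("type")
            .addStringField("title")
            .addInt64Field("score")
            .build();

        Pipeline p = Pipeline.create();
        Row row = Row.withSchema(schema).addValues("someone", "story", "a title", 3L).build();

        p.apply(Create.of(row).withCoder(RowCoder.of(schema)))
            .apply("CopyRows", ParDo.of(new DoFn<Row, Row>() {
              @ProcessElement
              public void process(@Element Row r, OutputReceiver<Row> out) {
                out.output(r);
              }
            }))
            // Without one of the next two lines the ParDo output has no coder and
            // pipeline construction fails exactly as in the stack trace above.
            .setRowSchema(schema);
            // or: .setCoder(RowCoder.of(schema));

        p.run().waitUntilFinish();
      }
    }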

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 02, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 02, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 02, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 02, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 02, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 02, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 02, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 02, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
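
The line above is where the planner's push-down becomes visible: the projection (by, type, title, score) and the filter are handed to the BigQuery Storage Read API instead of being evaluated in BeamCalcRel. At the BigQueryIO level this corresponds roughly to the sketch below; the table reference is a placeholder, and only the selected fields and row restriction are taken from this log.

    import com.google.api.services.bigquery.model.TableRow;
    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.values.PCollection;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        PCollection<TableRow> rows =
            p.apply(
                BigQueryIO.readTableRows()
                    .from("some-project:some_dataset.HACKER_NEWS")  // placeholder table reference
                    .withMethod(BigQueryIO.TypedRead.Method.DIRECT_READ)
                    // Projection push-down: only these columns are read from the table.
                    .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                    // Filter push-down: evaluated by the Storage Read API, not in the pipeline.
                    .withRowRestriction(
                        "(`type` = 'story' OR `type` = 'job') AND `score` > 2"));

        p.run().waitUntilFinish();
      }
    }

In the SQL path the same behaviour is driven by the table's method property being DIRECT_READ, which is what distinguishes this test from readUsingDefaultMethod above.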
    Aug 02, 2020 6:45:13 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 02, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 02, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-ocEQTpn4fEplvbk6Emr5ZPiCaPPQFtuJ03U4HCDm9VY.jar
    Aug 02, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-tests-GQahVsTFMkvLoD2qknf74x2Dxioa_-lZJcaJYF__KrQ.jar
    Aug 02, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded-ipCjQ7mj7O0xIqli952myxKW17BN1-3ETmBkvflCfAg.jar
    Aug 02, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-aI9DxvrkL-P-7EFcyNbfPL73fT8wPvVlzLTWlURZqPg.jar
    Aug 02, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.24.0-SNAPSHOT-7s0FOVP-w7Xb6cIAhzm8aG7RcD886Q3LwEZe0Qyy3ac.jar
    Aug 02, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-nqBBPBCvoYjyjbL278a-w-n7-gjGTKagDNV8n_aMccM.jar
    Aug 02, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4076151495651604878.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-mGP98UvGG-fYQ6HeADVWZNhfLDFwpINEN6FKXFSQskE.jar
    Aug 02, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests-nFsXxkdu_U312RoaobDBUvsEBchS4ro-lMkdU8ot7PE.jar
    Aug 02, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT-8YgAGz1FhyQ7fBw1B7TSzyllyGRw3MKWPkSl3VQWAJA.jar
    Aug 02, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests--V2y6rNoJcMdsGyccEsgvVXtveCw4_fVndT0gj_LsDM.jar
    Aug 02, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests-lCcQS-uQxSvmJSJ6d5D-O7O_DiX11vvHaMuuj1_p80Q.jar
    Aug 02, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT-wMjk6clZsbcn_NH45TT3Uquqlj9e0zj_UC53ZXYfHJ4.jar
    Aug 02, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-bi-hOUcOrwPuSXqCxH6VwHNr-tmJeR18DgVvUrsTZ7U.jar
    Aug 02, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-ocEQTpn4fEplvbk6Emr5ZPiCaPPQFtuJ03U4HCDm9VY.jar
    Aug 02, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests-Rf0m5N5PmfLPYkK4XbiyUPO1wpeR32lPvBAF2c3bCGg.jar
    Aug 02, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-d6qOTbk5LYZi6DLPX09HdDSfKu_FDBukiMFQS5SyrKQ.jar
    Aug 02, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-lw7--77vZL0oAWyhRmzyVfl4fGI3D9Yf2pNvg6pYBnk.jar
    Aug 02, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-WKJXNKBpUtCieibZDViMJDh-BoyKoNw3XOXNSVcKaQ4.jar
    Aug 02, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT-8T_gh8XODdhxU1g_qBy2cJ4L-Y2axOiw5OZr9KUR7L0.jar
    Aug 02, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.24.0-SNAPSHOT-nkhYYmAjkSsTfzZwJo42n209OHvnA1rkpOJtRHEA0y0.jar
    Aug 02, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests-e2uQrPqfmhJxfW53NbjiKaB9SRO2ahweKYOJ5ndR_Pg.jar
    Aug 02, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests-pXixZWkL4sg_Dl7kSOnGmEJNrmw9iP-1NoK_YXNa3OU.jar
    Aug 02, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.24.0-SNAPSHOT-TZhnZ3-xyYTxrIY0xC7PayXhycbR7RVZEBdZiZBMjdw.jar
    Aug 02, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-AdzRTkxekIASOuDkQy6v0pQo3cHSD884oJku6U64JUo.jar
    Aug 02, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-uxTKA0nDd8i7SsYk2S8KF2eGQVz8BEEd_d1DFkFBGzQ.jar
    Aug 02, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-ZM2lJwIeAC7kfL5vyM_FQNan8TWC5N5d9jt_obKhiSQ.jar
    Aug 02, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-UKKqS9UxwrLyOMye5_x4LMNNQk5kkqt9lRuXPFxJE7Q.jar
    Aug 02, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests-OAOfLanwlllwfi4tVm7J5892Nj3F_yEPPYQPV_wJ-4o.jar
    Aug 02, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.24.0-SNAPSHOT-6AR9EJZeCqc0eP0FWJJ0KWaxAD3b4Zrkd2GNCAUOHYI.jar
    Aug 02, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.24.0-SNAPSHOT-wO6wVVDpUVEdZoifOZrgLYGDIojuT6E09ZoM4NeRN-4.jar
    Aug 02, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.24.0-SNAPSHOT-WMtzn-9lwcY8Sfb0mehpXmCtAg_UYlSrhIP4h-WL_ag.jar
    Aug 02, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 1 seconds
    Aug 02, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 02, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 02, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 02, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 02, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 02, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93803 bytes, hash 2ef0876cc734937990aba7c342fff79d08b2d65c9e4ac171176f273244598c59> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-LvCHbMc0k3mQq6fDQv_3nQiy1lyeSsFxF28nMkRZjFk.pb
    Aug 02, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.24.0-SNAPSHOT
    Aug 02, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-01_23_45_16-5392378404118667027?project=apache-beam-testing
    Aug 02, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-01_23_45_16-5392378404118667027
    Aug 02, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-01_23_45_16-5392378404118667027
    Aug 02, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-02T06:45:16.922Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 02, 2020 6:45:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-02T06:45:24.568Z: Worker configuration: n1-standard-1 in us-central1-a.
    Aug 02, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-02T06:45:25.274Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 02, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-02T06:45:25.308Z: Expanding GroupByKey operations into optimizable parts.
    Aug 02, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-02T06:45:25.338Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 02, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-02T06:45:25.412Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 02, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-02T06:45:25.462Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 02, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-02T06:45:25.494Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 02, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-02T06:45:25.518Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 02, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-02T06:45:25.886Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 02, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-02T06:45:25.953Z: Starting 5 workers in us-central1-a...
    Aug 02, 2020 6:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-02T06:45:51.541Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 02, 2020 6:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-02T06:45:53.035Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 02, 2020 6:46:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-02T06:46:17.145Z: Workers have started successfully.
    Aug 02, 2020 6:46:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-02T06:46:17.179Z: Workers have started successfully.
    Aug 02, 2020 6:46:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-02T06:46:54.620Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 02, 2020 6:46:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-02T06:46:54.789Z: Cleaning up.
    Aug 02, 2020 6:46:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-02T06:46:54.865Z: Stopping worker pool...
    Aug 02, 2020 6:47:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-02T06:47:49.920Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 02, 2020 6:47:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-02T06:47:49.962Z: Worker pool stopped.
    Aug 02, 2020 6:48:00 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-01_23_45_16-5392378404118667027 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 8e7eca9b-3720-4826-97ab-f65ebd49b503 and timestamp: 2020-08-02T06:48:00.167000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    18.814

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 02, 2020 6:48:00 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.019 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 5,5,main]) completed. Took 2 mins 56.233 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 44s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/y6qqn5f4rttxy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #820

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/820/display/redirect>

Changes:


------------------------------------------
[...truncated 292.88 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 02, 2020 12:45:19 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 02, 2020 12:45:19 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 02, 2020 12:45:20 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 02, 2020 12:45:20 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 02, 2020 12:45:20 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 02, 2020 12:45:20 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 02, 2020 12:45:20 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 02, 2020 12:45:20 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 02, 2020 12:45:20 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 02, 2020 12:45:20 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 02, 2020 12:45:20 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 02, 2020 12:45:20 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 02, 2020 12:45:20 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 02, 2020 12:45:20 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 02, 2020 12:45:20 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 02, 2020 12:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 02, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 02, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-mqcIKrNwOFA8qMahY7yXDOLPplOTbkoXC_T4kHxFhr0.jar
    Aug 02, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-mEhCmYdAiG_aTmN8d1Q8K2VUMbhjN1zsjdxnk-BD468.jar
    Aug 02, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests-zuwC2nYehHZaq723u4YxgJlCkzvy33Gjuq1iutI5Zo8.jar
    Aug 02, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded-EqgdM0vj1Dd7PwvCXnw9Jd-V-czbmUHFd0MjVPVp8LU.jar
    Aug 02, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests-cyjVGas-gLzCtLkOgDxDVZOMKuT4Hx2VF7iBZKyATTE.jar
    Aug 02, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT-rUEiL2q3qXlpSbG6UkOnLQFwPQQsMJYX5_F7MXUWYxc.jar
    Aug 02, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.24.0-SNAPSHOT-VzyYgQBl7ACNYyQB1soRz_pxydRNQ1DvUjwkJawjk5c.jar
    Aug 02, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests-mbq_HpmMkjR9yBfVgZYgA2kwxL9GluWcM6HYPhVI0oA.jar
    Aug 02, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-9c-62GTSngLyC_FV828LS1cQF0lJ6WxXB56NfUtD0WU.jar
    Aug 02, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests-ulgbgtPyOME7NVp59osnG76cCP56DvqMHR6QTBp9sF4.jar
    Aug 02, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-UwUtvTuLuyhpd86ErwKrxtmTpjw3O3hwDkPZLvsrTCU.jar
    Aug 02, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4335710787070450469.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-BfzDVl3bUz6qPxf8E_3vwPJtB2bxgvWqGfIBgUXO97E.jar
    Aug 02, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-tests-oM9BLGUaKhsnQuNzBP7vujTLoxncEJhARhhy8jAWOck.jar
    Aug 02, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests-ssoGRot6AUAE3D8yOV4ycmzYOz1rzFtAEcQ8sKj4js4.jar
    Aug 02, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-mqcIKrNwOFA8qMahY7yXDOLPplOTbkoXC_T4kHxFhr0.jar
    Aug 02, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-oPuJjq8L3Zj5GqJVX-WlHi0JlDzfKJCtT_D1cYeAPPI.jar
    Aug 02, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.24.0-SNAPSHOT-c0FoPTy5HSE15PqyMm1h3_UCSoHUnNrP38OyAADuRyI.jar
    Aug 02, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests-v2AKspylDWTpTi8_dMp1wUaR_0xVt0mDuJSQ9HouZM8.jar
    Aug 02, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT-y8gT8Vc_taLWfkvPsqMwB9Iul4a5LykW1e0kmKePk8s.jar
    Aug 02, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-eDqxClc4yoUqIB_2juJR8ddVGtTHhGaHQmsFrXBJ05k.jar
    Aug 02, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-H-4QQBjp9Z-UYHLh_JZ0Z0Uj-yHSv1Q2Unkm3Uedrpc.jar
    Aug 02, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT-fqCTQX5CCDoZD_QSbY4_nMHLrKyvPVGJq7sHzs0Dv6U.jar
    Aug 02, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-Ch3uOAPQ_lrBOJz3qXKARqTiSkL5LM6Vj2_SSDefJbI.jar
    Aug 02, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.24.0-SNAPSHOT-Yz9N6L31UFTWFC5IK3eSS3kMQWVrID2QigSi0YI2pxM.jar
    Aug 02, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-ZkkVGkOHhJLgLvoRSC8PlitEIX60EN9bDkKqDvo6Hl8.jar
    Aug 02, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests-SN6m4wzj8nmqx44mvy9OnbvMbcZ7pCaxpVydsHIFgzk.jar
    Aug 02, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-iv9H5nCMc9kVMAgwONnvZSr5ieg6BAwvLyNVjHcibQY.jar
    Aug 02, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-JwyEeH2mnyduIeZh69WqeLKHrlp8NB0dRm7RAwkqhVY.jar
    Aug 02, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.24.0-SNAPSHOT-xXOpk7ohFznWl346T64ixBY0L3WNNQ2SLZZYB5Uwk2c.jar
    Aug 02, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.24.0-SNAPSHOT-sZ9X2yknF9Mz9gXESIh7iBl66NYo9OBhqXjLJlXGQ14.jar
    Aug 02, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.24.0-SNAPSHOT-wfEySmJszyqiZ-b_h8RspabIYteu1b7TfHoyFOHGpRI.jar
    Aug 02, 2020 12:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 1 seconds
    Aug 02, 2020 12:45:25 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 02, 2020 12:45:25 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 02, 2020 12:45:25 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 02, 2020 12:45:25 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 02, 2020 12:45:25 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 02, 2020 12:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93803 bytes, hash a6e34fd2f24c120b05e98bb0436d6e73c1259f82c487dea9cc7669dcf28959ba> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-puNP0vJMEgsF6YuwQ21uc8Eln4LEh96pzHZp3PKJWbo.pb
    Aug 02, 2020 12:45:25 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.24.0-SNAPSHOT
    Aug 02, 2020 12:45:26 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-01_17_45_25-3316284301289898395?project=apache-beam-testing
    Aug 02, 2020 12:45:26 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-01_17_45_25-3316284301289898395
    Aug 02, 2020 12:45:26 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-01_17_45_25-3316284301289898395
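    (Editor's note, not part of the build output: the same cancellation can also be issued from harness code through the PipelineResult handle returned by run(). A minimal, hedged sketch; the method and variable names are illustrative and not taken from the IT code.)

        import java.io.IOException;
        import org.apache.beam.sdk.Pipeline;
        import org.apache.beam.sdk.PipelineResult;

        // Illustrative only: cancel a submitted job programmatically instead of
        // using the gcloud command printed above. On the Dataflow runner the
        // returned PipelineResult is a DataflowPipelineJob.
        static void submitAndCancel(Pipeline pipeline) throws IOException {
          PipelineResult result = pipeline.run();
          result.cancel();   // asks the service to cancel the running job
        }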
    Aug 02, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-02T00:45:25.814Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 02, 2020 12:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-02T00:45:34.576Z: Worker configuration: n1-standard-1 in us-central1-a.
    Aug 02, 2020 12:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-02T00:45:35.830Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 02, 2020 12:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-02T00:45:35.873Z: Expanding GroupByKey operations into optimizable parts.
    Aug 02, 2020 12:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-02T00:45:35.895Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 02, 2020 12:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-02T00:45:35.971Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 02, 2020 12:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-02T00:45:36.006Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 02, 2020 12:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-02T00:45:36.038Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 02, 2020 12:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-02T00:45:36.073Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 02, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-02T00:45:36.456Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 02, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-02T00:45:36.539Z: Starting 5 workers in us-central1-a...
    Aug 02, 2020 12:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-02T00:45:42.757Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 02, 2020 12:46:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-02T00:46:01.806Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 02, 2020 12:46:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-02T00:46:27.740Z: Workers have started successfully.
    Aug 02, 2020 12:46:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-02T00:46:27.774Z: Workers have started successfully.
    Aug 02, 2020 12:47:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-02T00:47:08.675Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 02, 2020 12:47:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-02T00:47:08.896Z: Cleaning up.
    Aug 02, 2020 12:47:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-02T00:47:08.989Z: Stopping worker pool...
    Aug 02, 2020 12:47:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-02T00:47:54.794Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 02, 2020 12:47:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-02T00:47:54.837Z: Worker pool stopped.
    Aug 02, 2020 12:48:01 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-01_17_45_25-3316284301289898395 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): d5f0de43-a68d-489c-a1eb-dad679bdc568 and timestamp: 2020-08-02T00:48:01.332000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    20.613

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 02, 2020 12:48:01 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.022 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 8,5,main]) completed. Took 2 mins 51.114 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 44s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/6xndzffdpsuh2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #819

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/819/display/redirect>

Changes:


------------------------------------------
[...truncated 293.94 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 01, 2020 6:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 01, 2020 6:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 01, 2020 6:45:19 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 01, 2020 6:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 01, 2020 6:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 01, 2020 6:45:19 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 01, 2020 6:45:19 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
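    (Editor's note, not part of the build output: the failure above is the coder problem the exception itself describes. The PCollection produced by BeamSqlRelUtils.toPCollection carries Beam Row elements but no schema, so no Row coder can be inferred. A minimal, hedged sketch of the remedy the message points at, setRowSchema; the field names follow the SELECT list in the query above, while the field types and the helper name are assumptions.)

        import org.apache.beam.sdk.schemas.Schema;
        import org.apache.beam.sdk.values.PCollection;
        import org.apache.beam.sdk.values.Row;

        // Illustrative only: attach a schema so a Row coder can be inferred,
        // as suggested by the IllegalStateException above.
        static PCollection<Row> attachSchema(PCollection<Row> rows) {
          Schema schema = Schema.builder()
              .addStringField("author")   // `by` AS `author`
              .addStringField("type")
              .addStringField("title")
              .addInt64Field("score")     // assumed INT64; adjust to the actual table schema
              .build();
          return rows.setRowSchema(schema);
        }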

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 01, 2020 6:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 01, 2020 6:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 01, 2020 6:45:19 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 01, 2020 6:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 01, 2020 6:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 01, 2020 6:45:19 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 01, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 01, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
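    (Editor's note, not part of the build output: the usedFields list and the filter shown in the plan above are what a stand-alone DIRECT_READ expresses through withSelectedFields and withRowRestriction. A hedged sketch follows; the table reference is a placeholder, not the one used by the IT.)

        import java.util.Arrays;
        import com.google.api.services.bigquery.model.TableRow;
        import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
        import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;

        // Illustrative only: a direct (Storage API) read with the same projection
        // and row restriction that the SQL planner pushed down above.
        TypedRead<TableRow> read =
            BigQueryIO.readTableRows()
                .from("some-project:some_dataset.HACKER_NEWS")   // placeholder table reference
                .withMethod(TypedRead.Method.DIRECT_READ)
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                .withRowRestriction("(`type` = 'story' OR `type` = 'job') AND `score` > 2");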
    Aug 01, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 01, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 01, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-noajlMDuq3BZu9wEgbjOeilP4GP3tmEYSXlglVrTtlM.jar
    Aug 01, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-eXuU3JKShd3w906s0sZ4Pn9bn8W9-kARtIC8M2nlGgM.jar
    Aug 01, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.24.0-SNAPSHOT-astZFB1chrAzqrgupt04KgferQUAKvOPFvHPBMsseuU.jar
    Aug 01, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests-fQ4CgXppVFmirVV6z2uCAgxgVdF1zBBJrsvvYoCfMmE.jar
    Aug 01, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT-wDoHQ4f9aShRPN4m2tWlgAHCq66gkCVE5EaLDaHF62Q.jar
    Aug 01, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-CsPLfTkzvtDwZ5ByJrzjkHDXVx1P6yTzlj4GeZHhsyw.jar
    Aug 01, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-Vd2oARZUDm0nI73B2Gq7ahlVBGmVXqTBywtI-ItJYec.jar
    Aug 01, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests-0q0GHwdSW786Wsy5upEwcQwy3Y5_AurJLsyJaXp6DLA.jar
    Aug 01, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-CogKxJgGeV3Wn2iPpffmot_RZEdze7YvIj17rrCrAHU.jar
    Aug 01, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.24.0-SNAPSHOT-gA9uDUmw7nYS0U14pestNNBI-CIu9s3ajnHJYdutiM8.jar
    Aug 01, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-A5TN6P9jbcno3eSkT852wbpaxgu3LL0rgJGP0UNTcJw.jar
    Aug 01, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests-FzSu5ecT2BinpP16iHSOLHUO7n6yT2Mno_y5b3iaF2o.jar
    Aug 01, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests-vBWAz9e2SCtk37iY1BO-fsBdEqJUqqGC7Faz_S8loHU.jar
    Aug 01, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT-lP78ToIMQpu_QfrwcY1wMX6nSfTRRrKimCxtehYfYfE.jar
    Aug 01, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8121587996103957046.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-qlCB8KRGvK2g8b83ysbte5YFFnZglVWTGbHTUrEa0ho.jar
    Aug 01, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-_WHUJbQl3bQ-RsF6dJfASqw2e3BBpJ7gPOVbS7NI43k.jar
    Aug 01, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT-4MPILCtKtXq8_PPNyBcvR7moxPQ2mU2dzTj_fmrtEZM.jar
    Aug 01, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests-xAg6h-mc8Dr6gvlGLL72yISfb_IQUsNSAygceoJD0xE.jar
    Aug 01, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-6ITfDDrqeydnSiar2yan3ie5GKUpMulG55QKi_8DnFQ.jar
    Aug 01, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.24.0-SNAPSHOT-auJhqB9lg3OoY7LsFcyBZwokitIDcjI3g3fAWVSSl7g.jar
    Aug 01, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-P9ESCBbTOixOTymlY1F8sO1DaAvBofM5FNNjJoJokIE.jar
    Aug 01, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests-uFRBmXPsFYVRowGIbxMkPRw3m932jimM9AsH1hFz6cw.jar
    Aug 01, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT--7bb5dlNw4tY-JxPt5L6KPFOJNpI_fcavdOufMoZzXw.jar
    Aug 01, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests-418WMxF2tQGvZGGWJ-XQOq8G-AtCglwQiyb4KOxIXQs.jar
    Aug 01, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-noajlMDuq3BZu9wEgbjOeilP4GP3tmEYSXlglVrTtlM.jar
    Aug 01, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded-a8wcf8Pu0Auoc2fEOlq2YMooW4l5FlYcMQ7D0tCPTxc.jar
    Aug 01, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-tests-28s2KKxK7-SVTJvQQDCyHxp2PNHvKtVPZd-O_sephBM.jar
    Aug 01, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-DF5dCxGCuYFQfH-tqncyB1-dAMsJAueWzh00KmUX92Y.jar
    Aug 01, 2020 6:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.24.0-SNAPSHOT-Ta36vvEaH8s8wAE4Wlen5Yjg_UV_tEAibaJ59B-s4x4.jar
    Aug 01, 2020 6:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.24.0-SNAPSHOT-CdOtCUeocDkVigEpKjStJdqMN0_WiPIBqQtWYqC9jnU.jar
    Aug 01, 2020 6:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.24.0-SNAPSHOT-tl30KYg5UWEqC5YSeEMM1g5NIDUw4solh2ykJJ_eL9U.jar
    Aug 01, 2020 6:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 0 seconds
    Aug 01, 2020 6:45:23 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 01, 2020 6:45:23 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 01, 2020 6:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 01, 2020 6:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 01, 2020 6:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 01, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93803 bytes, hash 380ee2e29b545fdfb1d013cc49ea4b04d970bc23604d28658814658871ccc489> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-OA7i4ptUX9-x0BPMSepLBNlwvCNgTShliBRliHHMxIk.pb
    Aug 01, 2020 6:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.24.0-SNAPSHOT
    Aug 01, 2020 6:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-01_11_45_24-506992037371221487?project=apache-beam-testing
    Aug 01, 2020 6:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-01_11_45_24-506992037371221487
    Aug 01, 2020 6:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-01_11_45_24-506992037371221487
    Aug 01, 2020 6:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-01T18:45:24.362Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 01, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-01T18:45:31.880Z: Worker configuration: n1-standard-1 in us-central1-a.
    Aug 01, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-01T18:45:32.758Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 01, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-01T18:45:32.802Z: Expanding GroupByKey operations into optimizable parts.
    Aug 01, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-01T18:45:32.841Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 01, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-01T18:45:32.914Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 01, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-01T18:45:32.951Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 01, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-01T18:45:32.985Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 01, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-01T18:45:33.007Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 01, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-01T18:45:33.433Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 01, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-01T18:45:33.519Z: Starting 5 workers in us-central1-a...
    Aug 01, 2020 6:46:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-01T18:46:01.260Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 01, 2020 6:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-01T18:46:08.260Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 01, 2020 6:46:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-01T18:46:32.727Z: Workers have started successfully.
    Aug 01, 2020 6:46:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-01T18:46:32.760Z: Workers have started successfully.
    Aug 01, 2020 6:47:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-01T18:47:07.756Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 01, 2020 6:47:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-01T18:47:07.931Z: Cleaning up.
    Aug 01, 2020 6:47:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-01T18:47:08.013Z: Stopping worker pool...
    Aug 01, 2020 6:48:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-01T18:47:59.451Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 01, 2020 6:48:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-01T18:47:59.527Z: Worker pool stopped.
    Aug 01, 2020 6:48:09 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-01_11_45_24-506992037371221487 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 6ca0ba7b-a3b3-4f55-9ac3-33096a76378a and timestamp: 2020-08-01T18:48:09.153000000Z:
                     Metric:                    Value:
                   read_time                    17.979
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 01, 2020 6:48:09 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.018 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':',5,main]) completed. Took 2 mins 58.016 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 53s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/4boageizhwcra

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #818

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/818/display/redirect>

Changes:


------------------------------------------
[...truncated 293.36 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 01, 2020 12:45:38 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 01, 2020 12:45:38 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 01, 2020 12:45:38 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 01, 2020 12:45:38 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 01, 2020 12:45:38 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 01, 2020 12:45:38 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 01, 2020 12:45:38 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 01, 2020 12:45:38 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 01, 2020 12:45:38 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 01, 2020 12:45:38 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 01, 2020 12:45:38 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 01, 2020 12:45:38 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 01, 2020 12:45:38 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 01, 2020 12:45:38 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 01, 2020 12:45:39 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 01, 2020 12:45:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 01, 2020 12:45:41 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 01, 2020 12:45:41 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests-SQigBC3tTwShCpFMrso3bGP46scwtiiJ6zQXDpaW9FI.jar
    Aug 01, 2020 12:45:41 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT-McPHUVCkCIYm279ZQiXDiLPv-g59lFp7e9AVo4rUsGY.jar
    Aug 01, 2020 12:45:41 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-qgegGQc2NX6biGj8lhf60aUO7WiA5sP-Bn2mILnoPoQ.jar
    Aug 01, 2020 12:45:41 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-zk0tg_LDLRp_JRKSO7ft8lq3kqLqiec8M2NCbPUEa0Y.jar
    Aug 01, 2020 12:45:41 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-3ZoeWEgvkIgREdrPBPj6hq9aRouIRqEnZlRr3MWaVrY.jar
    Aug 01, 2020 12:45:41 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests-CBtXI-fZYgv1lFBWCW9b2nvrmeUlJXNkW2WrcXb0LE8.jar
    Aug 01, 2020 12:45:41 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-ffchMKDDGj8PsWMsPRNBqN_PS_8da0zz65hU5HZ8jn4.jar
    Aug 01, 2020 12:45:41 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests-jQWScV8eCpG2RtcHT-kjhXfrfHMFuDT8PPg5UG_02SI.jar
    Aug 01, 2020 12:45:41 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-yR5piBdxZc1u8y_xCJwLPIen9wZ91gqCBOlJc6vCupQ.jar
    Aug 01, 2020 12:45:41 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-hJMCr9dIUcCq-ROHTLbHItky4sUnrWmnLI4lloHqxwA.jar
    Aug 01, 2020 12:45:41 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-CZ5XBZg4YRb4o3uiOmzU5C9j2f4AKs74d4yXDVi2hDQ.jar
    Aug 01, 2020 12:45:41 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests-q12B3taP_3YKGakOJu66VnK2yAqv5vNwwN-reOgkFiY.jar
    Aug 01, 2020 12:45:41 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-7upSKqQVpTC0NmU5tCoOde3kaWw3Pd4WUyi5Zzv-6XI.jar
    Aug 01, 2020 12:45:41 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.24.0-SNAPSHOT-C_qJZCZqB2BVe24UHXWdJd2Cyj6n9Ud9SBTS6XWg81w.jar
    Aug 01, 2020 12:45:41 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4334979043326842090.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-TxGRa8AbIlZz8TB7C3GIUeyLI_wqfwyqssGi3pvTxSc.jar
    Aug 01, 2020 12:45:41 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests-PerywSeti3NcW9Ga4vkAucApzBPBVngKj9pUU_oKP_g.jar
    Aug 01, 2020 12:45:41 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests-qzUIOwEsSyYMg0BReivhef4uXRjx78c_c6fblDmDaY4.jar
    Aug 01, 2020 12:45:41 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-jM5O4aFBQX_2pF_39t2H9gZWtPgBtestpU9cSYkms14.jar
    Aug 01, 2020 12:45:41 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.24.0-SNAPSHOT-6EJMAt3NZIdDQE8Q98BQqT5Zxxf0t7eIx-i-1GJPbAo.jar
    Aug 01, 2020 12:45:41 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT-haZRbgokbPiMhonKZB6rO95xqg9RO6zYYWNPgiJtcCQ.jar
    Aug 01, 2020 12:45:41 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-XFYDy8Wj8Pe4mbOuvsdlHJmCN2janVaeNKu2glQIzL4.jar
    Aug 01, 2020 12:45:41 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-qgegGQc2NX6biGj8lhf60aUO7WiA5sP-Bn2mILnoPoQ.jar
    Aug 01, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.24.0-SNAPSHOT-wDNGbs8-Iixcq74Z7Tha31gABLyyOdQnYlKKSSR36C8.jar
    Aug 01, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT-J9x-c_sU11ESUae_lEco9K4kFT6emUN0CoE683av_Yc.jar
    Aug 01, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests-CVeOHOjUiw0rLNjqjNykU57CZS9fQ0Zk0RPr070DPCw.jar
    Aug 01, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded-6Ys_x2JW_F9WRMKV0v3Uno4R3Z5QfC1_GijBRyUdbJI.jar
    Aug 01, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-PI_AuuCvNc8Q0Kc7EsCzdEKVGs8VqY7JX1pIORboRpY.jar
    Aug 01, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-tests-P9tgSRm7ubFQ7SaGiYlcmLKNMgZgTHFKwwXEB5A30lo.jar
    Aug 01, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.24.0-SNAPSHOT-NqgidoAue-LBQMB_6rDtU9ERycdPnmurqLfsM6aurzc.jar
    Aug 01, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.24.0-SNAPSHOT-l9p81Jb9gloFF-tcshEMiGy590hrQOiF1w_8Iy78988.jar
    Aug 01, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.24.0-SNAPSHOT-K1J5H8o6KAaBPxX0PWUG9ksth4DMHxeuAZq75f2cLXU.jar
    Aug 01, 2020 12:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 1 seconds
    Aug 01, 2020 12:45:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 01, 2020 12:45:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 01, 2020 12:45:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 01, 2020 12:45:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 01, 2020 12:45:43 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 01, 2020 12:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93803 bytes, hash ffdbe17284c49dba20e1ba951466a092375b23d1a029527ed910b9fbadba5ca4> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-_9vhcoTEnbog4bqVFGagkjdbI9GgKVJ-2RC5-626XKQ.pb
    Aug 01, 2020 12:45:43 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.24.0-SNAPSHOT
    Aug 01, 2020 12:45:45 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-01_05_45_43-13583670425329458241?project=apache-beam-testing
    Aug 01, 2020 12:45:45 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-01_05_45_43-13583670425329458241
    Aug 01, 2020 12:45:45 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-01_05_45_43-13583670425329458241
    Aug 01, 2020 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-01T12:45:43.821Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 01, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-01T12:45:50.815Z: Worker configuration: n1-standard-1 in us-central1-a.
    Aug 01, 2020 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-01T12:45:52.092Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 01, 2020 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-01T12:45:52.132Z: Expanding GroupByKey operations into optimizable parts.
    Aug 01, 2020 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-01T12:45:52.166Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 01, 2020 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-01T12:45:52.239Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 01, 2020 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-01T12:45:52.267Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 01, 2020 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-01T12:45:52.303Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 01, 2020 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-01T12:45:52.328Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 01, 2020 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-01T12:45:52.632Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 01, 2020 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-01T12:45:52.709Z: Starting 5 workers in us-central1-a...
    Aug 01, 2020 12:46:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-01T12:46:18.964Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 01, 2020 12:46:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-01T12:46:20.279Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 01, 2020 12:46:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-01T12:46:42.582Z: Workers have started successfully.
    Aug 01, 2020 12:46:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-01T12:46:42.614Z: Workers have started successfully.
    Aug 01, 2020 12:47:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-01T12:47:24.574Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 01, 2020 12:47:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-01T12:47:24.747Z: Cleaning up.
    Aug 01, 2020 12:47:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-01T12:47:24.829Z: Stopping worker pool...
    Aug 01, 2020 12:48:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-01T12:48:22.055Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 01, 2020 12:48:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-01T12:48:22.099Z: Worker pool stopped.
    Aug 01, 2020 12:48:30 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-01_05_45_43-13583670425329458241 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): b586963b-2038-4f44-bb3b-61a284a2da4f and timestamp: 2020-08-01T12:48:30.657000000Z:
                     Metric:                    Value:
                   read_time                    23.529
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 01, 2020 12:48:31 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.022 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 8,5,main]) completed. Took 3 mins 0.791 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
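
Both suggestions above can be combined when re-running only the failing task locally; the task name and flags are taken from this log, while the Gradle wrapper path is an assumption about the checkout layout:
> ./gradlew :sdks:java:extensions:sql:perf-tests:integrationTest --stacktrace --warning-mode all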

BUILD FAILED in 4m 13s
106 actionable tasks: 72 executed, 34 from cache

Publishing build scan...
https://gradle.com/s/h6lzbwoztdejo

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #817

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/817/display/redirect>

Changes:


------------------------------------------
[...truncated 292.94 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 01, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 01, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 01, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 01, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 01, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 01, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 01, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
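
The three root causes listed above all point the same way: the ParDo(RowMonitor) step outputs Beam Rows, and a coder for Row can only be inferred once a schema is attached to the output PCollection. The sketch below is only an illustration of that fix, using a made-up schema and a stand-in DoFn rather than the actual BigQueryIOPushDownIT code; it follows the error message's own suggestion to call PCollection.setRowSchema.

    // Minimal sketch (assumed names, not Beam test code): attach a schema so the Row coder can be inferred.
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class SetRowSchemaSketch {
      public static void main(String[] args) {
        // Illustrative schema; field names mirror the SELECT list above, not the real table schema.
        Schema schema = Schema.builder()
            .addStringField("author")
            .addStringField("type")
            .addStringField("title")
            .addInt32Field("score")
            .build();

        Pipeline p = Pipeline.create();
        PCollection<Row> monitored =
            p.apply(Create.of(Row.withSchema(schema).addValues("someone", "story", "a title", 3).build())
                    .withCoder(RowCoder.of(schema)))
             .apply("RowMonitor", ParDo.of(new DoFn<Row, Row>() {  // stand-in for ParDo(RowMonitor)
               @ProcessElement
               public void processElement(@Element Row row, OutputReceiver<Row> out) {
                 out.output(row);
               }
             }));

        // Without this call, pipeline construction fails with the IllegalStateException shown above,
        // because no coder can be inferred for a plain ParDo that outputs Beam Rows.
        monitored.setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }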

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 01, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 01, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 01, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 01, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 01, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 01, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 01, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 01, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 01, 2020 6:45:13 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 01, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 01, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-NPRo29oW2TusMY48jFNky8j4pUi9xxxR1gfKnPAB1PU.jar
    Aug 01, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests-upGIaZJAdIFTxAclOchzcHKiDc5LBFP5RjYliY4jAQw.jar
    Aug 01, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT-s5OpTjtJ8eS_0gDWU78NJgVmRhdp5Q8RkF4rWcrlDL4.jar
    Aug 01, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.24.0-SNAPSHOT-Xwgu78HOIhiyZ33A216q7rGgSHKEsqgxCAOXitfjzYE.jar
    Aug 01, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-kg1EO4sLHBBrp6M9IlPXo7J8iXIwFYI2N3oJp4G1yUc.jar
    Aug 01, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests-GK2r0SHbLgQc5hNwiHFUnSDg8mzOOJxdlApM5bW_qds.jar
    Aug 01, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-sMwHWuYwjHM7HNImhkvLuYZvKoKiQSu7-YF9RzofDQE.jar
    Aug 01, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-NPRo29oW2TusMY48jFNky8j4pUi9xxxR1gfKnPAB1PU.jar
    Aug 01, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-pdqp1-QHiTkF-vZSDv6B1CyUQmAx2d5MGwMf8nfp6L0.jar
    Aug 01, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests-S8Nw5ChfyMdGZ4m_neJ-66wpEVtgMoSbVXfK6WrjZzI.jar
    Aug 01, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5410312702911316990.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-Ubn39Uw__GRqJwyuEKPwUSGRnE9VgJsBVbWlP1_EPHc.jar
    Aug 01, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.24.0-SNAPSHOT-hA0ARgTIp7dK8pqcH_AkECwEbVrdcKb5NMkCNL2WOBc.jar
    Aug 01, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-UCDkv3-TDVkf0D7QjlS5pAuKHI6LTFXmF_ZRBy723U4.jar
    Aug 01, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-_LesFOVn88SD0xSRSTbvWPgI2B74_9AMyeLucgAU2Gw.jar
    Aug 01, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests-eSP2B0AIGZ8wD7zKTHlIauqcHpkckZP1Xzdah-1YGsQ.jar
    Aug 01, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests-T6DWUwkDRORr2OgiUBogKs8p1N6A5l44r5unTgWi4Ho.jar
    Aug 01, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-Y8KWaFh5dyVjw0Y5K34USQw9Dfyl3aLx-YWDMDjJpNE.jar
    Aug 01, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT-F-Z9XzHzxJB3uY1h9vZMv9Mzx-tBZ6zBjRSDTWJixZQ.jar
    Aug 01, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-vN-8ReR5-NrUm2wIioAS9_xK7-AD72YmCjaKe17LkO0.jar
    Aug 01, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT-Uizvib7k5e7k-kG9CcWZD9LznGPoOgHtN6-aY38Z1LY.jar
    Aug 01, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-HXaLiO0tv0ukYzETBrK2KYKmMYOEKoBcOfBDBHJoyoU.jar
    Aug 01, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-CiZO50ObEe8L6K2V_mL7iknkDHRUoaiWLLOiMPPx4xY.jar
    Aug 01, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.24.0-SNAPSHOT-2n_xPnvCt5KQwqaLpUZ3_lcH8PwF-9hlnLkp1KQ3M38.jar
    Aug 01, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests-UfRSd2KXzGSCXyeCznZqRCd_vADfHgAfEJ-J4V_b6QE.jar
    Aug 01, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests-IOhZDRgWZ5Gtt8uJ0qV3L5zMjo2eDs6RI0RsjsYuXB4.jar
    Aug 01, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-a2jeHTsuZTeIVtLGGOUal1Wf66Nack9dyRIT4jDYPbY.jar
    Aug 01, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded-8GlsjvKvJSJwdDgwiG6JW-jbYMFkg1Ojp_TJavbvJcU.jar
    Aug 01, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-tests-4w01u3BoiFTKziAIdXpLXe0EJ1DOfOaovcN5x5nUiE8.jar
    Aug 01, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.24.0-SNAPSHOT-7iLzj0GH-gqAYVAPDHSe-3mZ8b3oG2uXE1BpS4m9U1E.jar
    Aug 01, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.24.0-SNAPSHOT-LnBQAzNurGz_gizDAVlzN5pfqM-xR3-Dt560WdwZ0Lg.jar
    Aug 01, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.24.0-SNAPSHOT-RPpRBsVfv1lBUb66HF4Kxm6U0shzFyCwYLKKGMlJ_Do.jar
    Aug 01, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 1 seconds
    Aug 01, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 01, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 01, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 01, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 01, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 01, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93803 bytes, hash ed7da59770f0263607622c821cc6daa1bfff2e14946329e507a1738d9e6fc5ae> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-7X2ll3DwJjYHYiyCHMbaob__LhSUYynlB6FzjZ5vxa4.pb
    Aug 01, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.24.0-SNAPSHOT
    Aug 01, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-31_23_45_17-11543627634935570102?project=apache-beam-testing
    Aug 01, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-07-31_23_45_17-11543627634935570102
    Aug 01, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-07-31_23_45_17-11543627634935570102
    Aug 01, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-01T06:45:17.266Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 01, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-01T06:45:24.854Z: Worker configuration: n1-standard-1 in us-central1-a.
    Aug 01, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-01T06:45:25.609Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 01, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-01T06:45:25.649Z: Expanding GroupByKey operations into optimizable parts.
    Aug 01, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-01T06:45:25.682Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 01, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-01T06:45:25.737Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 01, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-01T06:45:25.771Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 01, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-01T06:45:25.805Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 01, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-01T06:45:25.839Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 01, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-01T06:45:26.252Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 01, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-01T06:45:26.334Z: Starting 5 workers in us-central1-a...
    Aug 01, 2020 6:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-01T06:45:39.900Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 01, 2020 6:45:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-01T06:45:54.173Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 01, 2020 6:46:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-01T06:46:11.675Z: Workers have started successfully.
    Aug 01, 2020 6:46:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-01T06:46:11.707Z: Workers have started successfully.
    Aug 01, 2020 6:46:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-01T06:46:49.237Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 01, 2020 6:46:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-01T06:46:49.407Z: Cleaning up.
    Aug 01, 2020 6:46:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-01T06:46:49.509Z: Stopping worker pool...
    Aug 01, 2020 6:47:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-01T06:47:38.060Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 01, 2020 6:47:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-01T06:47:38.106Z: Worker pool stopped.
    Aug 01, 2020 6:47:46 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-07-31_23_45_17-11543627634935570102 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): a1180582-a815-429f-9ff8-2d72a03cb4e5 and timestamp: 2020-08-01T06:47:46.597000000Z:
                     Metric:                    Value:
                   read_time                    21.797
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 01, 2020 6:47:47 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
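
For context on what the passing push-down test exercises: the plan logged earlier reports usedFields=[by, type, title, score] and a pushed filter of (`type` = 'story' OR `type` = 'job') AND `score` > 2, so the BigQuery Storage API read fetches only those four columns and only the matching rows. Expressed directly against BigQueryIO rather than through Beam SQL, roughly the same read would look like the sketch below; the table reference is hypothetical and this is not the code the test runs.

    // Sketch only: a direct BigQueryIO read approximating the pushed-down SQL above.
    // The table reference is assumed; selected fields and the row restriction mirror the log.
    import java.util.Arrays;
    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
    import org.apache.beam.sdk.values.PCollection;

    public class DirectReadSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();
        PCollection<TableRow> rows =
            p.apply(
                BigQueryIO.readTableRows()
                    .from("some-project:some_dataset.HACKER_NEWS")  // hypothetical table reference
                    .withMethod(Method.DIRECT_READ)                 // BigQuery Storage API read
                    .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                    .withRowRestriction(
                        "(`type` = 'story' OR `type` = 'job') AND `score` > 2"));
        p.run().waitUntilFinish();
      }
    }

In the test itself, the Beam SQL planner derives both the field list and the row restriction automatically from the SELECT list and WHERE clause, which is what the BigQueryFilter=[[supported{...}, unsupported{}]] entry in the BEAMPlan records.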

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 2 mins 43.17 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 30s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/a5flg2zrnv5v6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #816

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/816/display/redirect?page=changes>

Changes:

[piotr.szuberski] [BEAM-10591] Fix flaky jdbc test by adding retries to container creation

[piotr.szuberski] Add logging after catching an exception

[kcweaver] Update Flink Version Compatibility table

[kcweaver] fixup! Update Flink Version Compatibility table

[kcweaver] [BEAM-9199] Update Dataflow --region option description.

[noreply] Enable all Jenkins jobs triggering for committers (#12407)

[ningk] Remove trailing whitespace from README

[noreply] Widen ranges for GCP libraries (#12198)


------------------------------------------
[...truncated 294.16 KB...]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 01, 2020 12:45:29 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 01, 2020 12:45:29 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 01, 2020 12:45:30 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 01, 2020 12:45:30 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 01, 2020 12:45:30 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 01, 2020 12:45:30 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 01, 2020 12:45:30 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 01, 2020 12:45:30 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 01, 2020 12:45:30 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 01, 2020 12:45:30 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 01, 2020 12:45:30 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 01, 2020 12:45:30 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 01, 2020 12:45:30 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 01, 2020 12:45:30 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 01, 2020 12:45:30 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 01, 2020 12:45:31 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 01, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 01, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-byMs-kK8nCd-dImvcrVYxGYuivTAoys39VDeI_vMj0I.jar
    Aug 01, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-byMs-kK8nCd-dImvcrVYxGYuivTAoys39VDeI_vMj0I.jar
    Aug 01, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-AiMsu0iHz_BTLLQHwUk7wGCgTNEHvjstsSwI2WLEx50.jar
    Aug 01, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5608778273654501930.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-oV8El_57RH8bgVFmSUIyJFf-Af3RdvDrcGhNG-NJHrI.jar
    Aug 01, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT-stivEOMDK_hdOI0X9i6z-FiBLRZcge0BOXbuHR_f9iE.jar
    Aug 01, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.24.0-SNAPSHOT-FILTkNcbNZGfvrvBpS4xEtmnwhMOD7oAMcZNexycC70.jar
    Aug 01, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests-7fumPkSbdZEitHsPxDkwqvj6um_9TUUNX42CTHEcD1g.jar
    Aug 01, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.24.0-SNAPSHOT-4jGiFiAuVcluaX4B-VZt_mvYYCCrz5Mi97wBpGxORF8.jar
    Aug 01, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-O59sexjAPljIPLxVN_JPPB8lW6C472rAf0lxL-xyz6M.jar
    Aug 01, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-lCMWvtvV27nYzq5HRSPDcn6mBE5dzyF6_eRCsPmUZz0.jar
    Aug 01, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-YfRf1Mp6RxvaXi_1J_DwemSMBvdO2njPwmLImZcEF5M.jar
    Aug 01, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.24.0-SNAPSHOT-PIT-2JcU_GPn63LNbqr7JzV2X--y3OIxBc9Jk1rcNjg.jar
    Aug 01, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests-WLYbdFacafJIJqVzP0WFePFlaFLEwGkicQXDMPxECic.jar
    Aug 01, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-U98UhQ8emQBJvigrAY_w6AvlwUSsgeYsUJyt_PFKXO0.jar
    Aug 01, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-O6SmbwKWaRWeop3wEUbn3qtF57RIsOOkLBEI3cAzt2g.jar
    Aug 01, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-mv3j1d5aTSDvAkSzwJHf_0xabvvfK6OyOF0OwM8Fs0Q.jar
    Aug 01, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests-6ZtC_HKh4HsDKQU6UKWG8Dau5Zo0s0C3TE9SopBkqfo.jar
    Aug 01, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-XwJ8hyNj1RC9PcgiQFo1bqBa27LVqBsGvw5fxJ4zWfA.jar
    Aug 01, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT-ZGt4GpzZaJaTpftnnaD9pOJY9V16ZYOC0PLgyy3n0rw.jar
    Aug 01, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests-vzOMslm0LQ2it8XB1WRyfuHeKiYVvRfw62mjc_WOa48.jar
    Aug 01, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded-MxGIw14Ea_pH1F-umRcMSidc5g8j6KZ478CXJPg2_o4.jar
    Aug 01, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests-peDLssGKpZ7h3MLnEDvS-SE9m5bcM94CfwqwQKe84C8.jar
    Aug 01, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests-vweIOpyth1hMW4vXw06LCv_c_NvS--KpxSZFuHo2vWw.jar
    Aug 01, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT--ak5wdk1pA-PcvJh_3P-z2yhG-h3KNL8q1i_tjCcWqc.jar
    Aug 01, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests-ZxQcl0RZpDMSO_RMwUpFCXHc9SS33xavF8L_E4Pfl-I.jar
    Aug 01, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-tests-kOETvq1IYK3v7s4wzHA_Dw7RMC6AIogqx8bqMMam7rs.jar
    Aug 01, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-6MAClNVpOo_7IfDQ3jeUvooho6gTmuimPHadAXMeXYA.jar
    Aug 01, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT-_jx9pNO9M9mr6HhgCtZdMZCR-PrqWLoA8jG4GL7ejQQ.jar
    Aug 01, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.24.0-SNAPSHOT-PkYyfFHtopLg1p2FXN8g2r84kiAWfK05MGKandgiUsc.jar
    Aug 01, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.24.0-SNAPSHOT-eYgUIj_gV97VQ3YBDQlutV--hzI1OOO0bD7EZ1UpQfs.jar
    Aug 01, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.24.0-SNAPSHOT-C74W89AvPeQWU963lHwV4bu6P3MHhuySEoTbdONVVN4.jar
    Aug 01, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 1 seconds
    Aug 01, 2020 12:45:34 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 01, 2020 12:45:34 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 01, 2020 12:45:34 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 01, 2020 12:45:34 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 01, 2020 12:45:34 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 01, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93803 bytes, hash c599938522986b05a194c15aa06bd0c3eeee2903b10b03cede983bed2a2ec926> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-xZmThSKYawWhlMFaoGvQw-7uKQOxCwPO3pg77SouySY.pb
    Aug 01, 2020 12:45:35 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.24.0-SNAPSHOT
    Aug 01, 2020 12:45:36 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-31_17_45_35-13682093042142920487?project=apache-beam-testing
    Aug 01, 2020 12:45:36 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-07-31_17_45_35-13682093042142920487
    Aug 01, 2020 12:45:36 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-07-31_17_45_35-13682093042142920487
    Aug 01, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-01T00:45:35.462Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 01, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-01T00:45:43.638Z: Worker configuration: n1-standard-1 in us-central1-a.
    Aug 01, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-01T00:45:44.400Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 01, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-01T00:45:44.454Z: Expanding GroupByKey operations into optimizable parts.
    Aug 01, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-01T00:45:44.494Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 01, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-01T00:45:44.585Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 01, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-01T00:45:44.639Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 01, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-01T00:45:44.682Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 01, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-01T00:45:44.709Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 01, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-01T00:45:45.316Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 01, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-01T00:45:45.399Z: Starting 5 workers in us-central1-a...
    Aug 01, 2020 12:46:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-01T00:46:13.444Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Aug 01, 2020 12:46:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-01T00:46:13.472Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Aug 01, 2020 12:46:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-01T00:46:17.043Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 01, 2020 12:46:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-01T00:46:18.826Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 01, 2020 12:46:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-01T00:46:31.773Z: Workers have started successfully.
    Aug 01, 2020 12:46:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-01T00:46:31.805Z: Workers have started successfully.
    Aug 01, 2020 12:47:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-01T00:47:15.266Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 01, 2020 12:47:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-01T00:47:15.550Z: Cleaning up.
    Aug 01, 2020 12:47:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-01T00:47:15.689Z: Stopping worker pool...
    Aug 01, 2020 12:48:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-01T00:48:10.917Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 01, 2020 12:48:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-01T00:48:10.986Z: Worker pool stopped.
    Aug 01, 2020 12:48:16 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-07-31_17_45_35-13682093042142920487 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 5fc83257-4ca6-4645-bf6b-cca38306c2a2 and timestamp: 2020-08-01T00:48:16.584000000Z:
                     Metric:                    Value:
                   read_time                    26.693
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 01, 2020 12:48:17 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 2 mins 55.171 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 0s
106 actionable tasks: 72 executed, 34 from cache

Publishing build scan...
https://gradle.com/s/edvids34reh5a

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #815

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/815/display/redirect?page=changes>

Changes:

[srohde] Gracefully stop the TestStream RPC if the job stops.

[chamikaramj] Sets the region when looking up a BQ job


------------------------------------------
[...truncated 294.64 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Jul 31, 2020 6:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Jul 31, 2020 6:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Jul 31, 2020 6:45:12 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Jul 31, 2020 6:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Jul 31, 2020 6:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Jul 31, 2020 6:45:12 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Jul 31, 2020 6:45:12 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
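
The same coder failure repeats for this build. The first root cause listed in the message, setting a Coder explicitly, is also workable for rows via RowCoder.of(schema), i.e. rows.setCoder(RowCoder.of(schema)) instead of the setRowSchema call sketched after the first occurrence above. The snippet below only demonstrates, under invented names, that RowCoder is derived from the schema and round-trips a Row; it is not test code.

    import java.io.ByteArrayInputStream;
    import java.io.ByteArrayOutputStream;
    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.Row;

    public class RowCoderRoundTripSketch {
      public static void main(String[] args) throws Exception {
        Schema schema = Schema.builder()
            .addStringField("author")
            .addInt32Field("score")
            .build();

        // The coder that an explicit setCoder(...) call would supply for a Row PCollection.
        RowCoder coder = RowCoder.of(schema);

        Row row = Row.withSchema(schema).addValues("alice", 3).build();

        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        coder.encode(row, bytes);
        Row decoded = coder.decode(new ByteArrayInputStream(bytes.toByteArray()));

        // Prints true: the row round-trips under the schema-derived coder.
        System.out.println(decoded.equals(row));
      }
    }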

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Jul 31, 2020 6:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Jul 31, 2020 6:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Jul 31, 2020 6:45:12 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Jul 31, 2020 6:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Jul 31, 2020 6:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Jul 31, 2020 6:45:12 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Jul 31, 2020 6:45:12 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Jul 31, 2020 6:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Jul 31, 2020 6:45:13 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Jul 31, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Jul 31, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-TPfuJLPkkkhDAsagnAjvYQ60Lf3u8HDAqq3KSZiTF1M.jar
    Jul 31, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-tYT36npddRElX36hP5QQCr8WFVytbHeyOf8RBxnHKA4.jar
    Jul 31, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded-q_q7xxR2AQAPspK4Inr8RBMdef66u0v6ZFIDuBRYTeQ.jar
    Jul 31, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-3X-w8_rLS38dCIm1j-k-KgXOzWX8FCbrLhi2f-3EoZw.jar
    Jul 31, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-tests-nQRvEA9399Ei80l8K-_24mr4GiKlVztD5ZTGqS8fHaw.jar
    Jul 31, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT-CwIkl98uGYtih4CmvHdrNn907Ok80i2MlePFqQCh_KI.jar
    Jul 31, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT-VgnwmEQHirEYj3uZ-ZmSw49_0PvCouChebEtOkrVaLo.jar
    Jul 31, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-mBELsiF_AUDqZRCGF3mYZK73Thwurd-XIGWp-p7bBsk.jar
    Jul 31, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.24.0-SNAPSHOT-b5PcKBN78JETqUzz0Gw4tkEElDKv33wMxgywnUfw6tY.jar
    Jul 31, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-DLFCtcy0DqnVJ0Se34yoUZ6DgPAa_F551m7ilrQ8AMw.jar
    Jul 31, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests-oYSuW1aCEWVtqqVz6wcnYEXWJYXZxGuaEyX8u-lHPcU.jar
    Jul 31, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-821v-HWXV-EAK9ku-9h1hOANIew9gHUVNqoQVzGbczo.jar
    Jul 31, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-iL5qhctd5OyX9wdLH2sBT2mShskm4-9wu5Iq5EcjBNE.jar
    Jul 31, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests-NVaJJGIFVmuTeL9QaFW35m3HEDAWh6yDsGk6aW0sM0M.jar
    Jul 31, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-4kif25WuGYjVoHnaYZkITkekVD89rjJDwdYd98XeQhs.jar
    Jul 31, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests-nysV4UIykjvBcvqqKp-M5dHJRsNwN5SuqcbNiUSeAX0.jar
    Jul 31, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests-5q7KEaKUP9zJfebYzGlLB4e1gIiLLQxydpOEEaH5hqg.jar
    Jul 31, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.24.0-SNAPSHOT-Xo4c8W4N1wRa2kuIZICR4CqKplW5j0KhsN3_SjNPWnQ.jar
    Jul 31, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-PSTdfzNssWDE3kQ0Pj7D0baFvthaFg3Td3pB5VO5MMA.jar
    Jul 31, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-yEhDWhhd0twVj26mwPsrdqUQPLmGmZZDSjGlsywk7G0.jar
    Jul 31, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3280860036987956731.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-BT6Zit-m5EIkMkR3LZGo5fJcUerkK0mHJ8UfJd53-_s.jar
    Jul 31, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT-wZH1_k2FtpLO9NysDFupi8f1j_SPcygArV7rlxDCvlc.jar
    Jul 31, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests-r0meIE1cHd04Hq-7d4muXgoyvzqwOUh8nyreRH00BXA.jar
    Jul 31, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests-qSlg-ZuOO5L6SdXt2bS-NyAUMjfrGIqHn3MCPw1GdLY.jar
    Jul 31, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests-n-cLAzzuzP8_VlRa9e5zbtMuasqjU7nH1Eil6mVNrA0.jar
    Jul 31, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-N55b-jqHLMMgwH6_1Od95a1VS83SjXCWUOhDPtRZ48Y.jar
    Jul 31, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.24.0-SNAPSHOT-q1-fRX5RmOOJlHr0XqgHSkHf8EXUlY28lbUhdRIPMv8.jar
    Jul 31, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-TPfuJLPkkkhDAsagnAjvYQ60Lf3u8HDAqq3KSZiTF1M.jar
    Jul 31, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.24.0-SNAPSHOT-LcrsKnTtSAhOXVkt5CHwRSUzjANx_R6uJ9_Sk2CK48o.jar
    Jul 31, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.24.0-SNAPSHOT-ExfXvcYGSNjiR4EGb4f78gm-lF0IIjq7361qjVdZFw8.jar
    Jul 31, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.24.0-SNAPSHOT-A1r6J0vZFzc68aXQHMFZTwAS7Pj-NPJAs8tqgeoW_G0.jar
    Jul 31, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 1 seconds
    Jul 31, 2020 6:45:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Jul 31, 2020 6:45:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Jul 31, 2020 6:45:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Jul 31, 2020 6:45:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Jul 31, 2020 6:45:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Jul 31, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93803 bytes, hash 900b95a283ec684c82961e6c82a34b3e87267e7a002a6adf4f3bf975592beebf> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-kAuVooPsaEyClh5sgqNLPocmfnoAKmrfTzv5dVkr7r8.pb
    Jul 31, 2020 6:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.24.0-SNAPSHOT
    Jul 31, 2020 6:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-31_11_45_17-2506957796219847260?project=apache-beam-testing
    Jul 31, 2020 6:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-07-31_11_45_17-2506957796219847260
    Jul 31, 2020 6:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-07-31_11_45_17-2506957796219847260
    Jul 31, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-07-31T18:45:17.389Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Jul 31, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-31T18:45:25.629Z: Worker configuration: n1-standard-1 in us-central1-a.
    Jul 31, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-31T18:45:26.373Z: Expanding CoGroupByKey operations into optimizable parts.
    Jul 31, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-31T18:45:26.407Z: Expanding GroupByKey operations into optimizable parts.
    Jul 31, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-31T18:45:26.518Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Jul 31, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-31T18:45:26.604Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Jul 31, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-31T18:45:26.640Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Jul 31, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-31T18:45:26.674Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Jul 31, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-31T18:45:26.698Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Jul 31, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-31T18:45:27.115Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Jul 31, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-31T18:45:27.192Z: Starting 5 workers in us-central1-a...
    Jul 31, 2020 6:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-07-31T18:45:36.257Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Jul 31, 2020 6:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-31T18:45:55.249Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Jul 31, 2020 6:46:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-31T18:46:15.876Z: Workers have started successfully.
    Jul 31, 2020 6:46:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-31T18:46:15.911Z: Workers have started successfully.
    Jul 31, 2020 6:46:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-31T18:46:56.577Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Jul 31, 2020 6:46:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-31T18:46:56.789Z: Cleaning up.
    Jul 31, 2020 6:46:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-31T18:46:56.868Z: Stopping worker pool...
    Jul 31, 2020 6:47:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-31T18:47:55.029Z: Autoscaling: Resized worker pool from 5 to 0.
    Jul 31, 2020 6:47:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-31T18:47:55.080Z: Worker pool stopped.
    Jul 31, 2020 6:48:04 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-07-31_11_45_17-2506957796219847260 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): a759e16d-e965-4bee-820c-5b56a1a61958 and timestamp: 2020-07-31T18:48:04.325000000Z:
                     Metric:                    Value:
                   read_time                    19.988
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Jul 31, 2020 6:48:04 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.018 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.025 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 3 mins 0.591 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 49s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/itezu2sxamr4y

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #814

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/814/display/redirect?page=changes>

Changes:

[Maximilian Michels] [BEAM-10602] Display Python streaming metrics in Grafana dashboard


------------------------------------------
[...truncated 294.00 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Jul 31, 2020 12:45:30 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Jul 31, 2020 12:45:30 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Jul 31, 2020 12:45:30 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Jul 31, 2020 12:45:30 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Jul 31, 2020 12:45:30 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Jul 31, 2020 12:45:30 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Jul 31, 2020 12:45:30 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
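
The IllegalStateException above is Beam's standard message for a Row PCollection that has no schema attached; the API it points to is PCollection.setRowSchema (or an explicit RowCoder). A minimal sketch of that API follows, with field names taken from the query's projection and field types assumed; it illustrates the call the message asks for, not the actual fix inside BeamSqlRelUtils:

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    class RowSchemaSketch {
      // 'rows' stands in for the PCollection<Row> coming out of ParDo(RowMonitor);
      // field names follow the query's projection, field types are assumed.
      static PCollection<Row> attachSchema(PCollection<Row> rows) {
        Schema schema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();
        // Attaching the schema is what the error message asks for; an explicit
        // rows.setCoder(RowCoder.of(schema)) would be the equivalent alternative.
        return rows.setRowSchema(schema);
      }
    }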

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Jul 31, 2020 12:45:30 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Jul 31, 2020 12:45:30 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Jul 31, 2020 12:45:31 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Jul 31, 2020 12:45:31 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Jul 31, 2020 12:45:31 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Jul 31, 2020 12:45:31 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Jul 31, 2020 12:45:31 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Jul 31, 2020 12:45:31 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Jul 31, 2020 12:45:31 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Jul 31, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Jul 31, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-wkSrcyFXpSf8FR9AmQYBSEUrvqz4OBlKU6UiUpE7nyw.jar
    Jul 31, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-tests-bb0hSJ1N6YPQZ59A8LUDWN8b8gRQBZQBN-pgyJGuF1U.jar
    Jul 31, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT-nclcvWfbClm58tKZovBO7rU15OcUirFAqWHss1q3vhI.jar
    Jul 31, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT-lW0iuwGOBUwJSJo13nVObOEQAltBFvwkaJH3Akrgn98.jar
    Jul 31, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests-E8UDJ0f4mGO4Sqj3DA9cIHeNR46104jCTR9Zy-VCODk.jar
    Jul 31, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests-vtv0F_kG2TGqa9oh8VrvwNt4g5h4NEVadYvEeaE5L5M.jar
    Jul 31, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests-OStv-F4La9HPkpFQEjnXq_GS1-5iLpGuutrUFtLboLQ.jar
    Jul 31, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests-316zFMa_Yst2DMuNgkrsrUt4iRC7MXo-OjTupGVYHIw.jar
    Jul 31, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7206092466911936501.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-9vTogGpcktVgKI5iW7r0ZEUe0DQBAbk_ZGs1EgUklb0.jar
    Jul 31, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-l3rf7WsSLp_esETxr9Kw-scfQP4TYRa9DUbj5ZZBARc.jar
    Jul 31, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-CgfgHKRuISk5NLD5_OAd5OEg6rNAsLX5r2BUtCplzCs.jar
    Jul 31, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-YbyBSqnxdUyfJUYYIokBP3zGOwSg2KLzTbEXu61NRQY.jar
    Jul 31, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests-t63OgmI8slfKVy6aLRfE1IV9OYtSXarpdM4b6XnpoyM.jar
    Jul 31, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT-0EnIUaXidWPjEWB8ykIfC11jtflyEYRK-ROkmToVGew.jar
    Jul 31, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-d89FaKoTZyzVJNbtJqTMjzxarcc33jMNNPHzU7ZvDuU.jar
    Jul 31, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests-yMKJx6cA83-DEiWZPao4yZsQEdDXu-LrItn4aOTjqmk.jar
    Jul 31, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-d89FaKoTZyzVJNbtJqTMjzxarcc33jMNNPHzU7ZvDuU.jar
    Jul 31, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-yMuQ1iAW2UjcGBwJ0Yl9utbQqtCXmep6x8sGn27U-v0.jar
    Jul 31, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-vPNoWzH_c7PWvJXndX_QNQ745YNnfudH5dfWGxx6m3M.jar
    Jul 31, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-95J1cBlOFeFymRoorikrJ15DdXDIhZVkFHgHJVVjj6U.jar
    Jul 31, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-hwhr17kH3rkzKvTV58JtX83mfOWOGsFSZIZtfQuBi44.jar
    Jul 31, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded-BbAB1EwQ6W9ds3IrvDVzVzNAEsSHeFrDhzveGkvQle8.jar
    Jul 31, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests-aKjfx5PoM2L1WbR9IrVgBfsdukzjjhtapeGtuju9OL4.jar
    Jul 31, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-cWnDUl9RxMejmilmK9J9kWDds1p0638XwRuaPSuEl_4.jar
    Jul 31, 2020 12:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.24.0-SNAPSHOT-WBlQMrlQZKcyItrUq9sFGRRT-clm-RAXldr4aXWFLFI.jar
    Jul 31, 2020 12:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.24.0-SNAPSHOT-DVwu5XM0YtlxgWYsX8YUCEwd3UuKxFKjWpIfNFlx7us.jar
    Jul 31, 2020 12:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.24.0-SNAPSHOT-qpjRItHqpQpM1Z9mZwAyMsn4nuxzRZGazARHxgMIPd8.jar
    Jul 31, 2020 12:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-2NFdwRkYf8WhL4d5ykuGLlY6NdefuwntULk4Zg2CAlQ.jar
    Jul 31, 2020 12:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.24.0-SNAPSHOT-XlC-0DH0xnFe6erLISV3p_RW4xM_HvhprEwkP0rD-eI.jar
    Jul 31, 2020 12:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.24.0-SNAPSHOT-94gWFH95p4z_IcDPVPfeU1Kk3iUxop8Wytf2b1tuU94.jar
    Jul 31, 2020 12:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.24.0-SNAPSHOT-dXZHfOj9RKfQmPT8o_ANwNrtZElW17yrghQ0Mos4_L4.jar
    Jul 31, 2020 12:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 1 seconds
    Jul 31, 2020 12:45:34 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Jul 31, 2020 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Jul 31, 2020 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Jul 31, 2020 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Jul 31, 2020 12:45:35 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Jul 31, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93803 bytes, hash 0246f3c5d4e715897a73f2ab5e61613f63c3e1c6cd4ca6d7ddcbee4ded98861e> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-AkbzxdTnFYl6c_KrXmFhP2PD4cbNTKbX3cvuTe2Yhh4.pb
    Jul 31, 2020 12:45:35 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.24.0-SNAPSHOT
    Jul 31, 2020 12:45:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-31_05_45_35-1316866340203069213?project=apache-beam-testing
    Jul 31, 2020 12:45:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-07-31_05_45_35-1316866340203069213
    Jul 31, 2020 12:45:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-07-31_05_45_35-1316866340203069213
    Jul 31, 2020 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-07-31T12:45:35.517Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Jul 31, 2020 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-31T12:45:43.797Z: Worker configuration: n1-standard-1 in us-central1-a.
    Jul 31, 2020 12:45:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-31T12:45:44.500Z: Expanding CoGroupByKey operations into optimizable parts.
    Jul 31, 2020 12:45:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-31T12:45:44.552Z: Expanding GroupByKey operations into optimizable parts.
    Jul 31, 2020 12:45:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-31T12:45:44.590Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Jul 31, 2020 12:45:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-31T12:45:44.694Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Jul 31, 2020 12:45:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-31T12:45:44.723Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Jul 31, 2020 12:45:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-31T12:45:44.765Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Jul 31, 2020 12:45:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-31T12:45:44.792Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Jul 31, 2020 12:45:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-31T12:45:45.214Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Jul 31, 2020 12:45:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-31T12:45:45.288Z: Starting 5 workers in us-central1-a...
    Jul 31, 2020 12:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-31T12:46:12.594Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Jul 31, 2020 12:46:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-07-31T12:46:20.132Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Jul 31, 2020 12:46:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-31T12:46:33.212Z: Workers have started successfully.
    Jul 31, 2020 12:46:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-31T12:46:33.250Z: Workers have started successfully.
    Jul 31, 2020 12:47:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-31T12:47:17.681Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Jul 31, 2020 12:47:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-31T12:47:17.895Z: Cleaning up.
    Jul 31, 2020 12:47:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-31T12:47:17.994Z: Stopping worker pool...
    Jul 31, 2020 12:48:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-31T12:48:18.785Z: Autoscaling: Resized worker pool from 5 to 0.
    Jul 31, 2020 12:48:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-31T12:48:18.864Z: Worker pool stopped.
    Jul 31, 2020 12:48:24 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-07-31_05_45_35-1316866340203069213 finished with status DONE.
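
This push-down run succeeded: the planner requested only the four used fields and handed the filter to the Storage API read, as logged above. Outside the SQL layer, the equivalent hand-written read looks roughly like the sketch below (the table reference is a placeholder, not the table this job actually reads):

    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class DirectReadSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
        p.apply(
            BigQueryIO.readTableRows()
                .from("my-project:my_dataset.hacker_news") // placeholder table reference
                .withMethod(Method.DIRECT_READ)
                // read only the columns the query touches ...
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                // ... and let the Storage API apply the pushed-down filter
                .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));
        p.run().waitUntilFinish();
      }
    }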

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): bb4af864-196e-45f7-8629-4182d25ed877 and timestamp: 2020-07-31T12:48:24.689000000Z:
                     Metric:                    Value:
                   read_time                    23.974
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Jul 31, 2020 12:48:25 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
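
The warning above only means this run was not given InfluxDB settings, so the read_time/fields_read values printed earlier stayed in the console instead of being published. When a job does publish, the measurement and database normally arrive as pipeline options on the test invocation; the option names and values below are assumptions for illustration, not taken from this job's configuration:

    > ./gradlew :sdks:java:extensions:sql:perf-tests:integrationTest -DintegrationTestPipelineOptions='["--influxMeasurement=sql_bqio_read_java_batch","--influxDatabase=beam_test_metrics","--influxHost=http://localhost:8086"]'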

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.077 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.076 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 3 mins 2.832 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
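
To reproduce just this suite locally with full stack traces, an invocation along the following lines should work (the task path is the one reported above; the --tests filter is illustrative):

    > ./gradlew :sdks:java:extensions:sql:perf-tests:integrationTest --tests '*BigQueryIOPushDownIT' --stacktrace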

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 8s
106 actionable tasks: 72 executed, 34 from cache

Publishing build scan...
https://gradle.com/s/2pz7vy6f3mf7u

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #813

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/813/display/redirect?page=changes>

Changes:

[Pablo Estrada] Standardizing BigQuery job names in Beam Python and Java SDKs

[Pablo Estrada] Naming BQ Copy jobs consistently in Python

[Kenneth Knowles] Upgrade spotbugs-gradle-plugin to 3.0.0

[Kenneth Knowles] Upgrade Spotbugs from 3.1.12 to 4.0.6

[Robert Burke] [BEAM-10610] Add ExternalWorker server

[Robert Burke] [BEAM-10610] Support loopback in universal

[Robert Burke] [BEAM-10610] Set GRPC recv limit to max.

[Robert Burke] [BEAM-10610] Support external env cfg.

[Robert Burke] [BEAM-10610] Make missed logs more readable.

[Robert Burke] [BEAM-10610] Fix control response race condition.

[Pablo Estrada] Applying fixes from comments.


------------------------------------------
[...truncated 294.43 KB...]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Jul 31, 2020 6:45:24 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Jul 31, 2020 6:45:24 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Jul 31, 2020 6:45:24 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Jul 31, 2020 6:45:24 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Jul 31, 2020 6:45:24 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Jul 31, 2020 6:45:24 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Jul 31, 2020 6:45:24 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Jul 31, 2020 6:45:25 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Jul 31, 2020 6:45:25 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Jul 31, 2020 6:45:25 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Jul 31, 2020 6:45:25 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Jul 31, 2020 6:45:25 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Jul 31, 2020 6:45:25 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Jul 31, 2020 6:45:25 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Jul 31, 2020 6:45:25 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Jul 31, 2020 6:45:26 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Jul 31, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Jul 31, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-64ffMFj5MeI-VOGYwcV0soWJFy4CplmJX_N6Of5QDcY.jar
    Jul 31, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.24.0-SNAPSHOT-pJoqfBTALTRpfpMYwXAqeb_mFPNHvg6vtVsrZmoD8E4.jar
    Jul 31, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests-l8gd3LFAPvDfwYGqddwviIoOorNRWb2R5_IHvUGzr5E.jar
    Jul 31, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests-PmkafcZt0MYemZgFNVc7_D_weo6Owb5lowhKHXEXqBU.jar
    Jul 31, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-9krEovr7kqVj54YkQr5f_hVS5T8gM5DJpHFMpfq2Yvs.jar
    Jul 31, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT-kUzx8sXQocH4IKZhpnRWe8Uq_iTF3R4kfhrP1CxxFhU.jar
    Jul 31, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests-HglFnIewZYyEyB97tAlZmucb4HBAte_0Z2pMRLjRze8.jar
    Jul 31, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT-Mqcd0VIBhIJ8I1hmE5gtxcH9MkHj_6iKXGg2CBKNwNM.jar
    Jul 31, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT-Wj8xcMwZSr8QMGpkoIxpMC-kRIqXPO4o4aGwAGOFMbI.jar
    Jul 31, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-nISy2rhRsaIpPN9AeyWli61WNKkg_iPLG0Q_ZWNeVmA.jar
    Jul 31, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests-RqHzbBaSbqSV8SJ7cUoxu93PrhrtOpyDWQBkXk-l16c.jar
    Jul 31, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-gZETjPyOe64vRLqPnL6tpyqVpmyPRFoPaVDaTblmWGA.jar
    Jul 31, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests-rw2CbKNPBXn5DR1ab8kdZsRtqbEgqRaw39XplniEWOU.jar
    Jul 31, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.24.0-SNAPSHOT-AY_fJ3dZDOF0b0lyjqzbDVA8elZkyuy8K6BIjRZZsSU.jar
    Jul 31, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-r14BLdZ1XrHdmWE4G6-kUmsAyYfc_2JLrsrYVHy6ec8.jar
    Jul 31, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-v7orrRYE91OnSC17rCCgF_Dol8h_E75Q243RLdSpI9M.jar
    Jul 31, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-Qk06L-yUziPPFOtErTGOOjiD98F_Qgbm7XWtv6b2808.jar
    Jul 31, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT--J3x6iey4LO2kXefhEWModUFpPltH83yLo1K2RUq1kE.jar
    Jul 31, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.24.0-SNAPSHOT-b6cUTho4voZFIRq1fjArOFacxoY9qNfGcxzXiKsG7-c.jar
    Jul 31, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-tests-oTbPgd75W5r3WnmzU__fMZj0hNXUVOBDrMmHl_nKU6Y.jar
    Jul 31, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-r2DKlPF9g544hZVezMN-2YryBB81e8lXtHuqVT7zUTw.jar
    Jul 31, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-8m_2IEaKSL8i5C8sS1M6KFw5VPFkOMoVwT2VHkmf7Ko.jar
    Jul 31, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests--TNvAgqHCMOukymLiMQ7RaXBh2g5v7TaguyLpnyOX30.jar
    Jul 31, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests-wbB2VpX3MkQLgv6dAu0AJV6CGKQlma1F3hetwgCERU4.jar
    Jul 31, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT--BudsY1k2hmLEYq-nhyPUz8XHe4EbAJA3LDYgtbp-yc.jar
    Jul 31, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test1634460478293610820.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-q4Mo7dyzJAZWrV-erciH2yCwP7i4kJaMIMmZjUJjycY.jar
    Jul 31, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded-DbHvne2W306iJqJAu4lEYzAO83-u3-mN6zWJurrG8kw.jar
    Jul 31, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-64ffMFj5MeI-VOGYwcV0soWJFy4CplmJX_N6Of5QDcY.jar
    Jul 31, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.24.0-SNAPSHOT-WCODxvCzAKDGWSDYAYSk5OiPRVuWrzkzYSNxoWTHVvc.jar
    Jul 31, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.24.0-SNAPSHOT-w7ARguq9AwqhJaw5RTVL7jyUaNtAI4tCIG5wyMONqO0.jar
    Jul 31, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.24.0-SNAPSHOT-tqHcLGuyqoWOmyASz64QVGsmUJC8Jt3tRHCh2oMkCKw.jar
    Jul 31, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 1 seconds
    Jul 31, 2020 6:45:29 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Jul 31, 2020 6:45:29 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Jul 31, 2020 6:45:29 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Jul 31, 2020 6:45:29 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Jul 31, 2020 6:45:29 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Jul 31, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93803 bytes, hash fd70deccc8d9a4b167018314f97f42dc4eb19eff9ac2318c552cea5888be07af> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-_XDezMjZpLFnAYMU-X9C3E6xnv-awjGMVSzqWIi-B68.pb
    Jul 31, 2020 6:45:29 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.24.0-SNAPSHOT
    Jul 31, 2020 6:45:30 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-30_23_45_29-17837615335048790051?project=apache-beam-testing
    Jul 31, 2020 6:45:30 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-07-30_23_45_29-17837615335048790051
    Jul 31, 2020 6:45:30 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-07-30_23_45_29-17837615335048790051
    Jul 31, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-07-31T06:45:29.694Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Jul 31, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-31T06:45:37.782Z: Worker configuration: n1-standard-1 in us-central1-a.
    Jul 31, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-31T06:45:38.619Z: Expanding CoGroupByKey operations into optimizable parts.
    Jul 31, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-31T06:45:38.663Z: Expanding GroupByKey operations into optimizable parts.
    Jul 31, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-31T06:45:38.693Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Jul 31, 2020 6:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-31T06:45:38.778Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Jul 31, 2020 6:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-31T06:45:38.820Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Jul 31, 2020 6:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-31T06:45:38.848Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Jul 31, 2020 6:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-31T06:45:38.882Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Jul 31, 2020 6:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-31T06:45:39.399Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Jul 31, 2020 6:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-31T06:45:39.481Z: Starting 5 workers in us-central1-a...
    Jul 31, 2020 6:46:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-07-31T06:46:09.133Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Jul 31, 2020 6:46:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-31T06:46:10.402Z: Autoscaling: Raised the number of workers to 3 based on the rate of progress in the currently running stage(s).
    Jul 31, 2020 6:46:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-31T06:46:10.427Z: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
    Jul 31, 2020 6:46:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-31T06:46:15.836Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Jul 31, 2020 6:46:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-31T06:46:38.619Z: Workers have started successfully.
    Jul 31, 2020 6:46:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-31T06:46:38.661Z: Workers have started successfully.
    Jul 31, 2020 6:47:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-31T06:47:20.284Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Jul 31, 2020 6:47:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-31T06:47:20.503Z: Cleaning up.
    Jul 31, 2020 6:47:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-31T06:47:20.592Z: Stopping worker pool...
    Jul 31, 2020 6:48:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-31T06:48:14.437Z: Autoscaling: Resized worker pool from 5 to 0.
    Jul 31, 2020 6:48:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-31T06:48:14.491Z: Worker pool stopped.
    Jul 31, 2020 6:48:22 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-07-30_23_45_29-17837615335048790051 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 69a752bd-98c1-4bb5-a2c7-ef3a4b4f07af and timestamp: 2020-07-31T06:48:22.785000000Z:
                     Metric:                    Value:
                   read_time                    21.525
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Jul 31, 2020 6:48:23 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.019 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 3 mins 6.45 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
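
For reference, a way to re-run just the failing task locally with the suggested option is sketched below; this assumes the Gradle wrapper at the root of a Beam checkout, which is not shown in this log:
> ./gradlew :sdks:java:extensions:sql:perf-tests:integrationTest --stacktrace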

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 6s
106 actionable tasks: 73 executed, 33 from cache

Publishing build scan...
https://gradle.com/s/qbucpm2mnnwwo

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #812

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/812/display/redirect?page=changes>

Changes:

[Robert Bradshaw] Convert katas to use with syntax rather than explicit run call.

[Robert Bradshaw] Move class definition outside of with statement.

[ningk] [BEAM-10545] KernelModel and jest tests

[Robert Bradshaw] Fix placeholder offsets.

[douglas.damon] Add Composite transforms to Go SDK katas

[jiyongjung] Bump google cloud bigquery to 1.26.1

[Rui Wang] remove redundant precommits.

[ningk] Change the syntax of private _onIOPub to a function declaration instead

[douglas.damon] Update stepik

[simonepri] Add failing test for count on an empty pcollection

[simonepri] Fix go count on an empty pcollection

[noreply] [BEAM-10559] Add apache_beam.examples.sql_taxi (#12399)


------------------------------------------
[...truncated 294.84 KB...]
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Jul 31, 2020 12:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Jul 31, 2020 12:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Jul 31, 2020 12:45:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Jul 31, 2020 12:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Jul 31, 2020 12:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Jul 31, 2020 12:45:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Jul 31, 2020 12:45:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
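
The two failing tests hit Beam's generic coder-inference error for PCollection<Row>: a Row carries no default Coder, so the row schema has to be attached to the output PCollection explicitly (or a coder set by hand). A minimal sketch of the usual remedy, using a hypothetical pass-through DoFn and an assumed Schema value rather than the actual RowMonitor code from the test, looks like:

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    // Hypothetical stand-in for the monitoring transform in the test: it simply
    // forwards rows, so Beam cannot infer a coder for its PCollection<Row> output.
    class PassThroughRowFn extends DoFn<Row, Row> {
      @ProcessElement
      public void processElement(@Element Row row, OutputReceiver<Row> out) {
        out.output(row);
      }
    }

    class RowSchemaExample {
      // Attaching the row schema lets Beam pick RowCoder for the output, which is
      // what the "Please provide a schema instead using PCollection.setRowSchema"
      // hint in the stack trace asks for.
      static PCollection<Row> monitored(PCollection<Row> rows, Schema schema) {
        return rows
            .apply("PassThroughRowFn", ParDo.of(new PassThroughRowFn()))
            .setRowSchema(schema);
      }
    }

Whether the real fix belongs in the monitoring transform itself or in the table provider that produces the rows cannot be determined from this log alone.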

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Jul 31, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Jul 31, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Jul 31, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Jul 31, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Jul 31, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Jul 31, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Jul 31, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Jul 31, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
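
    For context, the projection and filter pushed down above correspond to what BigQueryIO exposes for the Storage Read API (DIRECT_READ). A rough, hand-written equivalent is sketched below; the table spec is an assumption for illustration and is not taken from the test configuration, while the field list and predicate are copied from the plan and filter logged above:

    import java.util.Arrays;
    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
    import org.apache.beam.sdk.values.PCollection;

    class PushDownSketch {
      // Reads only the projected columns and applies the pushed-down predicate at
      // the source, mirroring the filter in the log entry above.
      static PCollection<TableRow> read(Pipeline pipeline, String tableSpec) {
        return pipeline.apply(
            BigQueryIO.readTableRows()
                .from(tableSpec) // e.g. an assumed "project:dataset.table" spec
                .withMethod(Method.DIRECT_READ)
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                .withRowRestriction(
                    "(`type` = 'story' OR `type` = 'job') AND `score` > 2"));
      }
    }
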
    Jul 31, 2020 12:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Jul 31, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Jul 31, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-NlpUwaTnMdgrzKtYt634aIslkp3Ve_wl3FnSB4yzQ6o.jar
    Jul 31, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-LopwfCfyaRv9KyY1mpe1jap3NWc_TfGcaR_a2g8xAnE.jar
    Jul 31, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-vMwzzkQCiaIK_lSkJ__v4AzS8ToUY1Y3r5vh7RzqrO4.jar
    Jul 31, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-NlpUwaTnMdgrzKtYt634aIslkp3Ve_wl3FnSB4yzQ6o.jar
    Jul 31, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.24.0-SNAPSHOT-rF9ZSXTAyIE1eXa08qknTa7h9AIc93pSI8O6ilJdB_Q.jar
    Jul 31, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT-lKlA8iYBTCgNPW70O3h7L7EpyabiEl4Z4mJ69r90T9Q.jar
    Jul 31, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-f8NnOdAnwZoe2uflNoLBRDBhZPnKmKZe0olZs_wVNcQ.jar
    Jul 31, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-SRH5ih2qv5UBG9VvjFJu-mOp3feKtUL1Ysc0JKYOpnA.jar
    Jul 31, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded-0MljTAkuc7dpwRdJceh1YEbELhKrWCo3jeNiKRRm2mk.jar
    Jul 31, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests-6T54Q0QAV4yA_4TMHRJPqqdWuBJN5oLdnJinChSnWx8.jar
    Jul 31, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8413457401368662259.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-MdCfwo_AtgMJuzQ7zV2aYKxV8F50_gEnGQ2EYYiHThE.jar
    Jul 31, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-ZDVISa7bPKLAFrQ15Z8b395NihMP0BWWuKxNFiC1UB8.jar
    Jul 31, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests-dVr3k4NUhLWaqqOeDpCiwajfCaicbPgnyWSWihAARqg.jar
    Jul 31, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-hYSUplqj1D9nD1WMLFwNYBtjitpzbFvZprxJgKznSp0.jar
    Jul 31, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests-euODWnp6WD9gBhZ2FR6BpJKZtS7S4pUxcIsyQ8zd-gU.jar
    Jul 31, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-tests-CaxWDGqRqXv3PVcykJeNApCUm8o3MDOAmuPdSRf8PPM.jar
    Jul 31, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.24.0-SNAPSHOT-jm3wd31UhQLU6Yv_56uVvX9SlX6u1OaHCIQfFUsc6PY.jar
    Jul 31, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT-W-CVQqk7FEOnUvKRHE9YXN6CrF1kQSTs6NFHb_1xfC0.jar
    Jul 31, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-9-tmdDe9MHdw-MPDsaAkugFanQhDq5cIqUc_W4vBS9I.jar
    Jul 31, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests-1wJisqY5FjNAGw2zz0ORySCV4xig-FlYHCTFpyhWrrs.jar
    Jul 31, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests-lgI3X1aa4Z5MvOA_RizD6bCxA9C3wcnWKNpjQPihdnE.jar
    Jul 31, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests-OOZnnckt4A9Cp2AH7IKXjokdAS6vYt501WUak_YxELY.jar
    Jul 31, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-1rjheoBBQrQv5SQzpLxjZtA_2Tth_kcW5C6bfmX3DyA.jar
    Jul 31, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT-1d4Hfi6d4QakhkGgsCcdENY4D81iR06LktsZ4CGfemY.jar
    Jul 31, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-F-FoYqMTF2qQv11M0tdA6a3g5AuLP2sfi84SCvjUWAM.jar
    Jul 31, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.24.0-SNAPSHOT-rcsYtVQtA-G_VNd2SA2NU0_TFUcXZM50ZqPSZVThlaY.jar
    Jul 31, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests-JzZDfgzATDTg23ZF9BIY9_6_9O5xTpzpsdz7tQbYw6Q.jar
    Jul 31, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-606pJ5J0AaTta_1GhWkb1bCfVdyh3D1CDLpgcr2hOcU.jar
    Jul 31, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.24.0-SNAPSHOT-G7c5xzOBKLnrhyN9lyYuvUN1S61ayeXF5IQgtE15jps.jar
    Jul 31, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.24.0-SNAPSHOT-XQ-3bEFbt8f3ExTsZkbuIqvXBgSrVZMiqnIt4vxK0eM.jar
    Jul 31, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.24.0-SNAPSHOT-qoWPH2aUleGC1Zoqjvu-dXrdkni6VbaHizM0FFQEovQ.jar
    Jul 31, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 11 seconds
    Jul 31, 2020 12:45:31 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Jul 31, 2020 12:45:31 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Jul 31, 2020 12:45:31 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Jul 31, 2020 12:45:31 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Jul 31, 2020 12:45:31 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Jul 31, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93803 bytes, hash 156c7d19c19da911fa899060d584bf4f8552c33410f73519089467c7ebf83115> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-FWx9GcGdqRH6iZBg1YS_T4VSwzQQ9zUZCJRnx-v4MRU.pb
    Jul 31, 2020 12:45:32 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.24.0-SNAPSHOT
    Jul 31, 2020 12:45:34 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-30_17_45_32-10810651197771964804?project=apache-beam-testing
    Jul 31, 2020 12:45:34 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-07-30_17_45_32-10810651197771964804
    Jul 31, 2020 12:45:34 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-07-30_17_45_32-10810651197771964804
    Jul 31, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-07-31T00:45:32.630Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Jul 31, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-31T00:45:42.393Z: Worker configuration: n1-standard-1 in us-central1-a.
    Jul 31, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-31T00:45:43.162Z: Expanding CoGroupByKey operations into optimizable parts.
    Jul 31, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-31T00:45:43.251Z: Expanding GroupByKey operations into optimizable parts.
    Jul 31, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-31T00:45:43.291Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Jul 31, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-31T00:45:43.513Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Jul 31, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-31T00:45:43.547Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Jul 31, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-31T00:45:43.576Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Jul 31, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-31T00:45:43.603Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Jul 31, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-31T00:45:44.276Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Jul 31, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-31T00:45:44.352Z: Starting 5 workers in us-central1-a...
    Jul 31, 2020 12:46:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-07-31T00:45:59.381Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Jul 31, 2020 12:46:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-31T00:46:12.259Z: Autoscaling: Raised the number of workers to 2 based on the rate of progress in the currently running stage(s).
    Jul 31, 2020 12:46:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-31T00:46:12.292Z: Resized worker pool to 2, though goal was 5.  This could be a quota issue.
    Jul 31, 2020 12:46:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-31T00:46:17.705Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Jul 31, 2020 12:46:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-31T00:46:33.269Z: Workers have started successfully.
    Jul 31, 2020 12:46:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-31T00:46:33.304Z: Workers have started successfully.
    Jul 31, 2020 12:47:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-31T00:47:16.822Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Jul 31, 2020 12:47:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-31T00:47:17.149Z: Cleaning up.
    Jul 31, 2020 12:47:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-31T00:47:17.271Z: Stopping worker pool...
    Jul 31, 2020 12:48:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-31T00:48:21.710Z: Autoscaling: Resized worker pool from 5 to 0.
    Jul 31, 2020 12:48:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-31T00:48:21.805Z: Worker pool stopped.
    Jul 31, 2020 12:48:27 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-07-30_17_45_32-10810651197771964804 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 42cef3a6-0b59-4a33-9f5c-9404b47b611b and timestamp: 2020-07-31T00:48:27.762000000Z:
                     Metric:                    Value:
                   read_time                    22.163
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Jul 31, 2020 12:48:28 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.033 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.046 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 3 mins 19.552 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 10s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/p2flrvgquesls

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #811

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/811/display/redirect?page=changes>

Changes:

[noreply] [BEAM-10331] Add SnowflakeIO to list of built-in IO transforms. (#12099)


------------------------------------------
[...truncated 293.12 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Jul 30, 2020 6:45:08 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Jul 30, 2020 6:45:08 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Jul 30, 2020 6:45:08 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Jul 30, 2020 6:45:08 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Jul 30, 2020 6:45:08 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Jul 30, 2020 6:45:08 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Jul 30, 2020 6:45:09 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Jul 30, 2020 6:45:09 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Jul 30, 2020 6:45:09 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Jul 30, 2020 6:45:09 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Jul 30, 2020 6:45:09 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Jul 30, 2020 6:45:09 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Jul 30, 2020 6:45:09 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Jul 30, 2020 6:45:09 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Jul 30, 2020 6:45:09 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Jul 30, 2020 6:45:10 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Jul 30, 2020 6:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Jul 30, 2020 6:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-dETkY56i0-G_bbAtwB0N4ChqZ7K1ruOXvGo0F_RgaTg.jar
    Jul 30, 2020 6:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-unshaded-9rQd3CiYNwO5_gNZowPCfciyaq4RQCXeNv2uuFXa7k0.jar
    Jul 30, 2020 6:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.24.0-SNAPSHOT-ugPvMWbptnLL55SywFU6PmoqOZwl6HVY3iprZbyrQko.jar
    Jul 30, 2020 6:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-uKITfY2hWfGpECrJ6lJF6rmrPypujEuxedgvUEzQL5c.jar
    Jul 30, 2020 6:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.24.0-SNAPSHOT-9AC4k67hUp_-1k-9hRd1-UmtNPbFMFzO50_i9kW3cFA.jar
    Jul 30, 2020 6:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-l3SEi6BgYx22KXgr-nfJQqkd0XYELYHCso0cGvRkB0g.jar
    Jul 30, 2020 6:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT-cUM6u9_fX39-jqFs9qGt9CX9ppSJwYPMagjJedShKUo.jar
    Jul 30, 2020 6:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test34899785718133289.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-_Q-PIZs_PDBDuU6JU6FEEWPH-iQZr0yy4HQwiUdvQPc.jar
    Jul 30, 2020 6:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-qoDQr4iP_6_1BzHwnFrjnwCtlttIzB876OTpH46Vyb4.jar
    Jul 30, 2020 6:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT-lvkmgvUOO05xx7vXEDvZVKTGPpDngCdZ2v_ZRiDYd1Y.jar
    Jul 30, 2020 6:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests-BrS4C8Hup4i_000GrspYRrzW-vwSpc5rC9szs0r0mZs.jar
    Jul 30, 2020 6:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests-zMbR05kIFG8Z0WKQJ9fy8SB4mrKmDJpJe6onDH57TOc.jar
    Jul 30, 2020 6:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-B6FY0zQJhRpJ-YiGBJ7vD7UJAdSkJzF-GlU7UGjahYI.jar
    Jul 30, 2020 6:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-tests-tsKsS_TJBAItgJoupKtZsqfpQydCUPiT3Hx8hN4TxFQ.jar
    Jul 30, 2020 6:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.24.0-SNAPSHOT-dETkY56i0-G_bbAtwB0N4ChqZ7K1ruOXvGo0F_RgaTg.jar
    Jul 30, 2020 6:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-tests-a3m7XoqC0OEmwH310Y5zuJvo6Tmwsv7y5hKsmK022jA.jar
    Jul 30, 2020 6:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.24.0-SNAPSHOT-5bohTmaSN2Fy_jS7uXLzY78Sgj7327EBTsa1rJcPjQ0.jar
    Jul 30, 2020 6:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.24.0-SNAPSHOT-e74RAMlXIHxNc8ci4YvK6DwjokdyEIm4KVYAgnIUJ38.jar
    Jul 30, 2020 6:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests-gTznYOdFLREA8_esoCcCJYLPBNE5J1RFXmoDqKXyqsE.jar
    Jul 30, 2020 6:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-U51irujH3GuTH8fBTKfPmi2vWLZpnpFHUmFOYc1S4lU.jar
    Jul 30, 2020 6:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.24.0-SNAPSHOT-dK4HqK-Ylburpj3tuqq7NUqXEhHxmHjTwDruvpdM_C0.jar
    Jul 30, 2020 6:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests-uPZRKffa5jFu0ZL0rHR6cpsJLRCyriqvNDaoJyjW_6s.jar
    Jul 30, 2020 6:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-yfPGdOOoyOwCBVse-iPN8yayo9yg3-2hIwHaB3ATP0U.jar
    Jul 30, 2020 6:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-SpsAM9ZhMYDqFd3tzCgG8WhuBhpny2POITuFclxr9Po.jar
    Jul 30, 2020 6:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests-hYd97-p7B4Nz4rrIpR_6tHuz8QNeK72tsEkPPv3P-No.jar
    Jul 30, 2020 6:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-tests-MTDZvq5T-Wm6es7QhSCN_pXT-65iPtFLjZCfsmU2OTQ.jar
    Jul 30, 2020 6:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-ZxfmKotjlGy606UZG1aMx5E_e6PGFJ2-GccbtBvEr6w.jar
    Jul 30, 2020 6:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.24.0-SNAPSHOT-dCAqU7QYzzeKH-TbgPxEPxnKdvyBqaCtHk_C8vD_SSw.jar
    Jul 30, 2020 6:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.24.0-SNAPSHOT-VHFAlZnFPz-FPDOZ9YIztOOKsAemyovpDfhE9MiNwf4.jar
    Jul 30, 2020 6:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.24.0-SNAPSHOT-lNMf41jFqdWY2IbSLkJYqXAU2Beo-1kAx4GBSlogVms.jar
    Jul 30, 2020 6:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.24.0-SNAPSHOT-iTTtHFZyU3rn2QQmmT8_gTUczp1i8JbjGGdNecE-5dc.jar
    Jul 30, 2020 6:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 185 files cached, 30 files newly uploaded in 1 seconds
    Jul 30, 2020 6:45:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Jul 30, 2020 6:45:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Jul 30, 2020 6:45:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Jul 30, 2020 6:45:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Jul 30, 2020 6:45:13 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Jul 30, 2020 6:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <93809 bytes, hash 84c36e2abe2279ffcb6a031ce6ebd3a1764a3db25cdd07f44abc0500b4f759ac> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-hMNuKr4ief_LagMc5uvToXZKPbJc3Qf0SrwFALT3Waw.pb
    Jul 30, 2020 6:45:14 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.24.0-SNAPSHOT
    Jul 30, 2020 6:45:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-30_11_45_14-7814684426474779295?project=apache-beam-testing
    Jul 30, 2020 6:45:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-07-30_11_45_14-7814684426474779295
    Jul 30, 2020 6:45:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-07-30_11_45_14-7814684426474779295
    Jul 30, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-07-30T18:45:14.311Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Jul 30, 2020 6:45:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-30T18:45:22.519Z: Worker configuration: n1-standard-1 in us-central1-f.
    Jul 30, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-30T18:45:23.218Z: Expanding CoGroupByKey operations into optimizable parts.
    Jul 30, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-30T18:45:23.393Z: Expanding GroupByKey operations into optimizable parts.
    Jul 30, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-30T18:45:23.427Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Jul 30, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-30T18:45:23.522Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Jul 30, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-30T18:45:23.548Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Jul 30, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-30T18:45:23.571Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Jul 30, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-30T18:45:23.593Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Jul 30, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-30T18:45:23.909Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Jul 30, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-30T18:45:23.969Z: Starting 5 workers in us-central1-f...
    Jul 30, 2020 6:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-07-30T18:45:34.683Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Jul 30, 2020 6:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-30T18:45:56.658Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Jul 30, 2020 6:46:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-30T18:46:15.322Z: Workers have started successfully.
    Jul 30, 2020 6:46:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-30T18:46:15.366Z: Workers have started successfully.
    Jul 30, 2020 6:46:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-30T18:46:52.444Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Jul 30, 2020 6:46:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-30T18:46:52.647Z: Cleaning up.
    Jul 30, 2020 6:46:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-30T18:46:52.738Z: Stopping worker pool...
    Jul 30, 2020 6:47:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-30T18:47:43.597Z: Autoscaling: Resized worker pool from 5 to 0.
    Jul 30, 2020 6:47:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-30T18:47:43.634Z: Worker pool stopped.
    Jul 30, 2020 6:47:50 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-07-30_11_45_14-7814684426474779295 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 8178d8d5-6d78-4583-a177-8815ae86028f and timestamp: 2020-07-30T18:47:50.721000000Z:
                     Metric:                    Value:
                   read_time                    16.517
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Jul 30, 2020 6:47:51 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
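
The warning above means the InfluxDB publisher was not configured, so the read_time and fields_read values are reported only in this log. A minimal sketch of the missing configuration, assuming the perf test reads the standard Beam testutils InfluxDB options (both the system property and the option names below are assumptions about the test setup, not taken from this job; placeholders in angle brackets are hypothetical):
    > ./gradlew :sdks:java:extensions:sql:perf-tests:integrationTest \
        -DintegrationTestPipelineOptions='["--influxHost=<influx-host>","--influxDatabase=<influx-database>","--influxMeasurement=<influx-measurement>"]'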

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.019 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 2 mins 49.819 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
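
As an illustrative rerun of the failing task with the diagnostics suggested above (assumes a local checkout with the same test configuration; not part of this build's output):
    > ./gradlew :sdks:java:extensions:sql:perf-tests:integrationTest --stacktrace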

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 36s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/ogp7e57uqb4qq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org